persistent-2.10.5.2/Database/0000755000000000000000000000000013464141060014030 5ustar0000000000000000persistent-2.10.5.2/Database/Persist/0000755000000000000000000000000013621633057015470 5ustar0000000000000000persistent-2.10.5.2/Database/Persist/Class/0000755000000000000000000000000013621317264016534 5ustar0000000000000000persistent-2.10.5.2/Database/Persist/Sql/0000755000000000000000000000000013620641325016223 5ustar0000000000000000persistent-2.10.5.2/Database/Persist/Sql/Orphan/0000755000000000000000000000000013621624146017455 5ustar0000000000000000persistent-2.10.5.2/Database/Persist/Sql/Types/0000755000000000000000000000000013464141060017324 5ustar0000000000000000persistent-2.10.5.2/Database/Persist/Types/0000755000000000000000000000000013555610075016575 5ustar0000000000000000persistent-2.10.5.2/test/0000755000000000000000000000000013621633057013312 5ustar0000000000000000persistent-2.10.5.2/Database/Persist.hs0000644000000000000000000002436313464141060016025 0ustar0000000000000000{-# LANGUAGE ExistentialQuantification #-} module Database.Persist ( module Database.Persist.Class , module Database.Persist.Types -- * Reference Schema & Dataset -- | -- -- All the combinators present here will be explained based on this schema: -- -- > share [mkPersist sqlSettings, mkMigrate "migrateAll"] [persistLowerCase| -- > User -- > name String -- > age Int -- > deriving Show -- > |] -- -- and this dataset. The examples below will refer to this as dataset-1. -- -- #dataset# -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |1 |SPJ |40 | -- > +-----+-----+-----+ -- > |2 |Simon|41 | -- > +-----+-----+-----+ -- * Query update combinators , (=.), (+=.), (-=.), (*=.), (/=.) -- * Query filter combinators , (==.), (!=.), (<.), (>.), (<=.), (>=.), (<-.), (/<-.), (||.) -- * JSON Utilities , listToJSON , mapToJSON , toJsonText , getPersistMap -- * Other utilities , limitOffsetOrder ) where import Data.Aeson (toJSON, ToJSON) import Data.Aeson.Text (encodeToTextBuilder) import qualified Data.Text as T import Data.Text.Lazy (toStrict) import Data.Text.Lazy.Builder (toLazyText) import Database.Persist.Types import Database.Persist.Class import Database.Persist.Class.PersistField (getPersistMap) infixr 3 =., +=., -=., *=., /=. (=.), (+=.), (-=.), (*=.), (/=.) :: forall v typ. PersistField typ => EntityField v typ -> typ -> Update v -- | Assign a field a value. -- -- === __Example usage__ -- -- @ -- updateAge :: MonadIO m => ReaderT SqlBackend m () -- updateAge = updateWhere [UserName ==. \"SPJ\" ] [UserAge =. 45] -- @ -- -- Similar to `updateWhere` which is shown in the above example you can use other functions present in the module "Database.Persist.Class". Note that the first parameter of `updateWhere` is [`Filter` val] and second parameter is [`Update` val]. By comparing this with the type of `==.` and `=.`, you can see that they match up in the above usage. -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+--------+ -- > |id |name |age | -- > +-----+-----+--------+ -- > |1 |SPJ |40 -> 45| -- > +-----+-----+--------+ -- > |2 |Simon|41 | -- > +-----+-----+--------+ f =. a = Update f a Assign -- | Assign a field by addition (@+=@). -- -- === __Example usage__ -- -- @ -- addAge :: MonadIO m => ReaderT SqlBackend m () -- addAge = updateWhere [UserName ==. \"SPJ\" ] [UserAge +=. 
1] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+---------+ -- > |id |name |age | -- > +-----+-----+---------+ -- > |1 |SPJ |40 -> 41 | -- > +-----+-----+---------+ -- > |2 |Simon|41 | -- > +-----+-----+---------+ f +=. a = Update f a Add -- | Assign a field by subtraction (@-=@). -- -- === __Example usage__ -- -- @ -- subtractAge :: MonadIO m => ReaderT SqlBackend m () -- subtractAge = updateWhere [UserName ==. \"SPJ\" ] [UserAge -=. 1] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+---------+ -- > |id |name |age | -- > +-----+-----+---------+ -- > |1 |SPJ |40 -> 39 | -- > +-----+-----+---------+ -- > |2 |Simon|41 | -- > +-----+-----+---------+ f -=. a = Update f a Subtract -- | Assign a field by multiplication (@*=@). -- -- === __Example usage__ -- -- @ -- multiplyAge :: MonadIO m => ReaderT SqlBackend m () -- multiplyAge = updateWhere [UserName ==. \"SPJ\" ] [UserAge *=. 2] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+--------+ -- > |id |name |age | -- > +-----+-----+--------+ -- > |1 |SPJ |40 -> 80| -- > +-----+-----+--------+ -- > |2 |Simon|41 | -- > +-----+-----+--------+ f *=. a = Update f a Multiply -- | Assign a field by division (@/=@). -- -- === __Example usage__ -- -- @ -- divideAge :: MonadIO m => ReaderT SqlBackend m () -- divideAge = updateWhere [UserName ==. \"SPJ\" ] [UserAge /=. 2] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+---------+ -- > |id |name |age | -- > +-----+-----+---------+ -- > |1 |SPJ |40 -> 20 | -- > +-----+-----+---------+ -- > |2 |Simon|41 | -- > +-----+-----+---------+ f /=. a = Update f a Divide infix 4 ==., <., <=., >., >=., !=. (==.), (!=.), (<.), (<=.), (>.), (>=.) :: forall v typ. PersistField typ => EntityField v typ -> typ -> Filter v -- | Check for equality. -- -- === __Example usage__ -- -- @ -- selectSPJ :: MonadIO m => ReaderT SqlBackend m [Entity User] -- selectSPJ = selectList [UserName ==. \"SPJ\" ] [] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |1 |SPJ |40 | -- > +-----+-----+-----+ f ==. a = Filter f (FilterValue a) Eq -- | Non-equality check. -- -- === __Example usage__ -- -- @ -- selectSimon :: MonadIO m => ReaderT SqlBackend m [Entity User] -- selectSimon = selectList [UserName !=. \"SPJ\" ] [] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |2 |Simon|41 | -- > +-----+-----+-----+ f !=. a = Filter f (FilterValue a) Ne -- | Less-than check. -- -- === __Example usage__ -- -- @ -- selectLessAge :: MonadIO m => ReaderT SqlBackend m [Entity User] -- selectLessAge = selectList [UserAge <. 41 ] [] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |1 |SPJ |40 | -- > +-----+-----+-----+ f <. a = Filter f (FilterValue a) Lt -- | Less-than or equal check. -- -- === __Example usage__ -- -- @ -- selectLessEqualAge :: MonadIO m => ReaderT SqlBackend m [Entity User] -- selectLessEqualAge = selectList [UserAge <=. 
40 ] [] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |1 |SPJ |40 | -- > +-----+-----+-----+ f <=. a = Filter f (FilterValue a) Le -- | Greater-than check. -- -- === __Example usage__ -- -- @ -- selectGreaterAge :: MonadIO m => ReaderT SqlBackend m [Entity User] -- selectGreaterAge = selectList [UserAge >. 40 ] [] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |2 |Simon|41 | -- > +-----+-----+-----+ f >. a = Filter f (FilterValue a) Gt -- | Greater-than or equal check. -- -- === __Example usage__ -- -- @ -- selectGreaterEqualAge :: MonadIO m => ReaderT SqlBackend m [Entity User] -- selectGreaterEqualAge = selectList [UserAge >=. 41 ] [] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |2 |Simon|41 | -- > +-----+-----+-----+ f >=. a = Filter f (FilterValue a) Ge infix 4 <-., /<-. (<-.), (/<-.) :: forall v typ. PersistField typ => EntityField v typ -> [typ] -> Filter v -- | Check if value is in given list. -- -- === __Example usage__ -- -- @ -- selectUsers :: MonadIO m => ReaderT SqlBackend m [Entity User] -- selectUsers = selectList [UserAge <-. [40, 41]] [] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |1 |SPJ |40 | -- > +-----+-----+-----+ -- > |2 |Simon|41 | -- > +-----+-----+-----+ -- -- -- @ -- selectSPJ :: MonadIO m => ReaderT SqlBackend m [Entity User] -- selectSPJ = selectList [UserAge <-. [40]] [] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |1 |SPJ |40 | -- > +-----+-----+-----+ f <-. a = Filter f (FilterValues a) In -- | Check if value is not in given list. -- -- === __Example usage__ -- -- @ -- selectSimon :: MonadIO m => ReaderT SqlBackend m [Entity User] -- selectSimon = selectList [UserAge /<-. [40]] [] -- @ -- -- The above query when applied on <#dataset dataset-1>, will produce this: -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |2 |Simon|41 | -- > +-----+-----+-----+ f /<-. a = Filter f (FilterValues a) NotIn infixl 3 ||. (||.) :: forall v. [Filter v] -> [Filter v] -> [Filter v] -- | The OR of two lists of filters. For example: -- -- > selectList -- > ([ PersonAge >. 25 -- > , PersonAge <. 30 ] ||. -- > [ PersonIncome >. 15000 -- > , PersonIncome <. 25000 ]) -- > [] -- -- will filter records where a person's age is between 25 and 30 /or/ a -- person's income is between (15000 and 25000). -- -- If you are looking for an @(&&.)@ operator to do @(A AND B AND (C OR D))@ -- you can use the @(++)@ operator instead as there is no @(&&.)@. For -- example: -- -- > selectList -- > ([ PersonAge >. 25 -- > , PersonAge <. 30 ] ++ -- > ([PersonCategory ==. 1] ||. -- > [PersonCategory ==. 5])) -- > [] -- -- will filter records where a person's age is between 25 and 30 /and/ -- (person's category is either 1 or 5). a ||. b = [FilterOr [FilterAnd a, FilterAnd b]] -- | Convert list of 'PersistValue's into textual representation of JSON -- object. This is a type-constrained synonym for 'toJsonText'. 
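--
-- === __Example usage__
--
-- A minimal, illustrative usage (the binding name is hypothetical; the exact
-- JSON text produced depends on the 'ToJSON' instance of 'PersistValue'):
--
-- @
-- example :: T.Text
-- example = listToJSON [PersistInt64 1, PersistText (T.pack \"hello\")]
-- @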
listToJSON :: [PersistValue] -> T.Text listToJSON = toJsonText -- | Convert map (list of tuples) into textual representation of JSON -- object. This is a type-constrained synonym for 'toJsonText'. mapToJSON :: [(T.Text, PersistValue)] -> T.Text mapToJSON = toJsonText -- | A more general way to convert instances of `ToJSON` type class to -- strict text 'T.Text'. toJsonText :: ToJSON j => j -> T.Text toJsonText = toStrict . toLazyText . encodeToTextBuilder . toJSON -- | FIXME What's this exactly? limitOffsetOrder :: PersistEntity val => [SelectOpt val] -> (Int, Int, [SelectOpt val]) limitOffsetOrder opts = foldr go (0, 0, []) opts where go (LimitTo l) (_, b, c) = (l, b ,c) go (OffsetBy o) (a, _, c) = (a, o, c) go x (a, b, c) = (a, b, x : c) persistent-2.10.5.2/Database/Persist/Quasi.hs0000644000000000000000000010142113621633057017105 0ustar0000000000000000{-# LANGUAGE BangPatterns, CPP #-} {-# LANGUAGE RankNTypes #-} {-# LANGUAGE PatternGuards #-} {-# LANGUAGE StandaloneDeriving, UndecidableInstances #-} {-# LANGUAGE ViewPatterns #-} {-| This module defines the Persistent entity syntax used in the quasiquoter to generate persistent entities. The basic structure of the syntax looks like this: > TableName > fieldName FieldType > otherField String > nullableField Int Maybe You start an entity definition with the table name, in this case, @TableName@. It's followed by a list of fields on the entity, which have the basic form @fieldName FieldType@. You can indicate that a field is nullable with 'Maybe' at the end of the type. @persistent@ automatically generates an ID column for you, if you don't specify one, so the above table definition corresponds to the following SQL: > CREATE TABLE table_name ( > id SERIAL PRIMARY KEY, > field_name field_type NOT NULL, > other_field varchar NOT NULL, > nullable_field int NULL > ); Note that the exact SQL that is generated can be customized using the 'PersistSettings' that are passed to the 'parse' function. It generates a Haskell datatype with the following form: @ data TableName = TableName { tableNameFieldName :: FieldType , tableNameOtherField :: String , tableNameNullableField :: Maybe Int } @ As with the SQL generated, the specifics of this are customizable. See the @persistent-template@ package for details. = Deriving You can add a deriving clause to a table, and the generated Haskell type will have a deriving clause with that. Unlike normal Haskell syntax, you don't need parentheses or commas to separate the classes, and you can even have multiple deriving clauses. > User > name String > age Int > deriving Eq Show > deriving Ord = Unique Keys You can define a uniqueness key on a table with the following format: > User > name String > age Int > > UniqueUserName name This will put a unique index on the @user@ table and the @name@ field. = Setting defaults You can use a @default=${sql expression}@ clause to set a default for a field. The thing following the `=` is interpreted as SQL that will be put directly into the table definition. @ User name Text admin Bool default=false @ This creates a SQL definition like this: > CREATE TABLE user ( > id SERIAL PRIMARY KEY, > name VARCHAR NOT NULL, > admin BOOL DEFAULT=false > ); A restriction here is that you still need to provide a value when performing an `insert`, because the generated Haskell type has the form: @ data User = User { userName :: Text , userAdmin :: Bool } @ You can work around this by using a 'Maybe Bool' and supplying 'Nothing' by default. 
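
As a sketch of that workaround (the entity here is illustrative, not part of
the library), marking the field @Maybe@ changes the generated record:

@
User
    name Text
    admin Bool Maybe default=false
@

generates

@
data User = User
    { userName :: Text
    , userAdmin :: Maybe Bool
    }
@

so an @insert@ only needs @User "Bob" Nothing@ on the Haskell side.
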
= Custom ID column If you don't want to use the default ID column type of 'Int64', you can set a custom type with an @Id@ field. This @User@ has a @Text@ ID. > User > Id Text > name Text > age Int If you do this, it's a good idea to set a default for the ID. Otherwise, you will need to use 'insertKey' instead of 'insert' when performing inserts. @ 'insertKey' (UserKey "Hello world!") (User "Bob" 32) @ If you attempt to do @'insert' (User "Bob" 32)@, then you will receive a runtime error because the SQL database doesn't know how to make an ID for you anymore. So instead just use a default expression, like this: @ User Id Text default=generate_user_id() name Text age Int @ = Custom Primary Keys Sometimes you don't want to have an ID column, and you want a different sort of primary key. This is a table that stores unique email addresses, and the email is the primary key. We store the first and second part (eg @first\@second@) separately. @ Email firstPart Text secondPart Text Primary firstPart secondPart @ This creates a table with the following form: @ CREATE TABLE email ( first_part varchar, second_part varchar, PRIMARY KEY (first_part, second_part) @ You can specify 1 or more columns in the primary key. = Overriding SQL You can use a @sql=custom@ annotation to provide some customization on the entity and field. For example, you might prefer to name a table differently than what @persistent@ will do by default. You may also prefer to name a field differently. @ User sql=big_user_table fullName String sql=name age Int @ This will alter the generated SQL to be: @ CREATE TABEL big_user_table ( id SERIAL PRIMARY KEY, name VARCHAR, age INT ); @ = Attributes The QuasiQuoter allows you to provide arbitrary attributes to an entity or field. This can be used to extend the code in ways that the library hasn't anticipated. If you use this feature, we'd definitely appreciate hearing about it and potentially supporting your use case directly! @ User !funny field String !sad good Dog !sogood @ We can see the attributes using the 'entityAttrs' field and the 'fieldAttrs' field. @ userAttrs = do let userDefinition = 'entityDef' ('Proxy' :: 'Proxy' User) let userAttributes = 'entityAttrs' userDefinition let fieldAttributes = 'map' 'fieldAttrs' ('entityFields' userDefinition) print userAttributes -- ["funny"] print fieldAttributes -- [["sad"],["sogood"]] @ = Documentation Comments The quasiquoter supports ordinary comments with @--@ and @#@. Since @persistent-2.10.5.1@, it also supports documentation comments. The grammar for documentation comments is similar to Haskell's Haddock syntax, with a few restrictions: 1. Only the @-- | @ form is allowed. 2. You must put a space before and after the @|@ pipe character. 3. The comment must be indented at the same level as the entity or field it documents. An example of the field documentation is: @ -- | I am a doc comment for a User. Users are important -- | to the application, and should be treasured. User -- | Users have names. Call them by names. name String -- | A user can be old, or young, and we care about -- | this for some reason. age Int @ The documentation is present on the `entityComments` field on the `EntityDef` for the entity: @ >>> let userDefinition = entityDef (Proxy :: Proxy User) >>> entityComments userDefinition "I am a doc comment for a User. 
Users are important\nto the application, and should be treasured.\n" @ Likewise, the field documentation is present in the `fieldComments` field on the `FieldDef` present in the `EntityDef`: @ >>> let userFields = entityFields userDefinition >>> let comments = map fieldComments userFields >>> mapM_ putStrLn comments "Users have names. Call them by names." "A user can be old, or young, and we care about\nthis for some reason." @ Unfortunately, we can't use this to create Haddocks for you, because . `persistent` backends *can* use this to generate SQL @COMMENT@s, which are useful for a database perspective, and you can use the library to render a Markdown document of the entity definitions. -} module Database.Persist.Quasi ( parse , PersistSettings (..) , upperCaseSettings , lowerCaseSettings , nullable #if TEST , Token (..) , Line' (..) , preparse , tokenize , parseFieldType , empty , removeSpaces , associateLines , skipEmpty , LinesWithComments(..) #endif ) where import Prelude hiding (lines) import qualified Data.List.NonEmpty as NEL import Data.List.NonEmpty (NonEmpty(..)) import Control.Arrow ((&&&)) import Control.Monad (msum, mplus) import Data.Char import Data.List (find, foldl') import qualified Data.Map as M import Data.Maybe (mapMaybe, fromMaybe, maybeToList, listToMaybe) import Data.Monoid (mappend) import Data.Text (Text) import qualified Data.Text as T import Database.Persist.Types data ParseState a = PSDone | PSFail String | PSSuccess a Text deriving Show parseFieldType :: Text -> Either String FieldType parseFieldType t0 = case parseApplyFT t0 of PSSuccess ft t' | T.all isSpace t' -> Right ft PSFail err -> Left $ "PSFail " ++ err other -> Left $ show other where parseApplyFT t = case goMany id t of PSSuccess (ft:fts) t' -> PSSuccess (foldl' FTApp ft fts) t' PSSuccess [] _ -> PSFail "empty" PSFail err -> PSFail err PSDone -> PSDone parseEnclosed :: Char -> (FieldType -> FieldType) -> Text -> ParseState FieldType parseEnclosed end ftMod t = let (a, b) = T.break (== end) t in case parseApplyFT a of PSSuccess ft t' -> case (T.dropWhile isSpace t', T.uncons b) of ("", Just (c, t'')) | c == end -> PSSuccess (ftMod ft) (t'' `Data.Monoid.mappend` t') (x, y) -> PSFail $ show (b, x, y) x -> PSFail $ show x parse1 t = case T.uncons t of Nothing -> PSDone Just (c, t') | isSpace c -> parse1 $ T.dropWhile isSpace t' | c == '(' -> parseEnclosed ')' id t' | c == '[' -> parseEnclosed ']' FTList t' | isUpper c -> let (a, b) = T.break (\x -> isSpace x || x `elem` ("()[]"::String)) t in PSSuccess (getCon a) b | otherwise -> PSFail $ show (c, t') getCon t = case T.breakOnEnd "." t of (_, "") -> FTTypeCon Nothing t ("", _) -> FTTypeCon Nothing t (a, b) -> FTTypeCon (Just $ T.init a) b goMany front t = case parse1 t of PSSuccess x t' -> goMany (front . (x:)) t' PSFail err -> PSFail err PSDone -> PSSuccess (front []) t -- _ -> data PersistSettings = PersistSettings { psToDBName :: !(Text -> Text) , psStrictFields :: !Bool -- ^ Whether fields are by default strict. Default value: @True@. -- -- @since 1.2 , psIdName :: !Text -- ^ The name of the id column. 
Default value: @id@ -- The name of the id column can also be changed on a per-model basis -- -- -- @since 2.0 } defaultPersistSettings, upperCaseSettings, lowerCaseSettings :: PersistSettings defaultPersistSettings = PersistSettings { psToDBName = id , psStrictFields = True , psIdName = "id" } upperCaseSettings = defaultPersistSettings lowerCaseSettings = defaultPersistSettings { psToDBName = let go c | isUpper c = T.pack ['_', toLower c] | otherwise = T.singleton c in T.dropWhile (== '_') . T.concatMap go } -- | Parses a quasi-quoted syntax into a list of entity definitions. parse :: PersistSettings -> Text -> [EntityDef] parse ps = parseLines ps . preparse preparse :: Text -> [Line] preparse = removeSpaces . filter (not . empty) . map tokenize . T.lines -- | A token used by the parser. data Token = Spaces !Int -- ^ @Spaces n@ are @n@ consecutive spaces. | Token Text -- ^ @Token tok@ is token @tok@ already unquoted. | DocComment Text -- ^ @DocComment@ is a documentation comment, unmodified. deriving (Show, Eq) -- | Tokenize a string. tokenize :: Text -> [Token] tokenize t | T.null t = [] | "-- | " `T.isPrefixOf` t = [DocComment t] | "--" `T.isPrefixOf` t = [] -- Comment until the end of the line. | "#" `T.isPrefixOf` t = [] -- Also comment to the end of the line, needed for a CPP bug (#110) | T.head t == '"' = quotes (T.tail t) id | T.head t == '(' = parens 1 (T.tail t) id | isSpace (T.head t) = let (spaces, rest) = T.span isSpace t in Spaces (T.length spaces) : tokenize rest -- support mid-token quotes and parens | Just (beforeEquals, afterEquals) <- findMidToken t , not (T.any isSpace beforeEquals) , Token next : rest <- tokenize afterEquals = Token (T.concat [beforeEquals, "=", next]) : rest | otherwise = let (token, rest) = T.break isSpace t in Token token : tokenize rest where findMidToken t' = case T.break (== '=') t' of (x, T.drop 1 -> y) | "\"" `T.isPrefixOf` y || "(" `T.isPrefixOf` y -> Just (x, y) _ -> Nothing quotes t' front | T.null t' = error $ T.unpack $ T.concat $ "Unterminated quoted string starting with " : front [] | T.head t' == '"' = Token (T.concat $ front []) : tokenize (T.tail t') | T.head t' == '\\' && T.length t' > 1 = quotes (T.drop 2 t') (front . (T.take 1 (T.drop 1 t'):)) | otherwise = let (x, y) = T.break (`elem` ['\\','\"']) t' in quotes y (front . (x:)) parens count t' front | T.null t' = error $ T.unpack $ T.concat $ "Unterminated parens string starting with " : front [] | T.head t' == ')' = if count == (1 :: Int) then Token (T.concat $ front []) : tokenize (T.tail t') else parens (count - 1) (T.tail t') (front . (")":)) | T.head t' == '(' = parens (count + 1) (T.tail t') (front . ("(":)) | T.head t' == '\\' && T.length t' > 1 = parens count (T.drop 2 t') (front . (T.take 1 (T.drop 1 t'):)) | otherwise = let (x, y) = T.break (`elem` ['\\','(',')']) t' in parens count y (front . (x:)) -- | A string of tokens is empty when it has only spaces. There -- can't be two consecutive 'Spaces', so this takes /O(1)/ time. empty :: [Token] -> Bool empty [] = True empty [Spaces _] = True empty _ = False -- | A line. We don't care about spaces in the middle of the -- line. Also, we don't care about the ammount of indentation. data Line' f = Line { lineIndent :: Int , tokens :: f Text } deriving instance Show (f Text) => Show (Line' f) deriving instance Eq (f Text) => Eq (Line' f) mapLine :: (forall x. f x -> g x) -> Line' f -> Line' g mapLine k (Line i t) = Line i (k t) traverseLine :: Functor t => (forall x. 
f x -> t (g x)) -> Line' f -> t (Line' g) traverseLine k (Line i xs) = Line i <$> k xs type Line = Line' [] -- | Remove leading spaces and remove spaces in the middle of the -- tokens. removeSpaces :: [[Token]] -> [Line] removeSpaces = map toLine where toLine (Spaces i:rest) = toLine' i rest toLine xs = toLine' 0 xs toLine' i = Line i . mapMaybe fromToken fromToken (Token t) = Just t fromToken (DocComment t) = Just t fromToken Spaces{} = Nothing -- | Divide lines into blocks and make entity definitions. parseLines :: PersistSettings -> [Line] -> [EntityDef] parseLines ps lines = fixForeignKeysAll $ toEnts lines where toEnts :: [Line] -> [UnboundEntityDef] toEnts = map mk . associateLines . skipEmpty mk :: LinesWithComments -> UnboundEntityDef mk lwc = let Line _ (name :| entAttribs) :| rest = lwcLines lwc in setComments (lwcComments lwc) $ mkEntityDef ps name entAttribs (map (mapLine NEL.toList) rest) isComment :: Text -> Maybe Text isComment xs = T.stripPrefix "-- | " xs data LinesWithComments = LinesWithComments { lwcLines :: NonEmpty (Line' NonEmpty) , lwcComments :: [Text] } deriving (Eq, Show) -- TODO: drop this and use <> when 8.2 isn't supported anymore so the -- monoid/semigroup nonsense isn't annoying appendLwc :: LinesWithComments -> LinesWithComments -> LinesWithComments appendLwc a b = LinesWithComments (foldr NEL.cons (lwcLines b) (lwcLines a)) (lwcComments a `mappend` lwcComments b) newLine :: Line' NonEmpty -> LinesWithComments newLine l = LinesWithComments (pure l) [] firstLine :: LinesWithComments -> Line' NonEmpty firstLine = NEL.head . lwcLines consLine :: Line' NonEmpty -> LinesWithComments -> LinesWithComments consLine l lwc = lwc { lwcLines = NEL.cons l (lwcLines lwc) } consComment :: Text -> LinesWithComments -> LinesWithComments consComment l lwc = lwc { lwcComments = l : lwcComments lwc } associateLines :: [Line' NonEmpty] -> [LinesWithComments] associateLines lines = foldr combine [] $ foldr toLinesWithComments [] lines where toLinesWithComments line linesWithComments = case linesWithComments of [] -> [newLine line] (lwc : lwcs) -> case isComment (NEL.head (tokens line)) of Just comment | lineIndent line == lowestIndent -> consComment comment lwc : lwcs _ -> if lineIndent line <= lineIndent (firstLine lwc) then consLine line lwc : lwcs else newLine line : lwc : lwcs lowestIndent = minimum . fmap lineIndent $ lines combine :: LinesWithComments -> [LinesWithComments] -> [LinesWithComments] combine lwc [] = [lwc] combine lwc (lwc' : lwcs) = let minIndent = minimumIndentOf lwc otherIndent = minimumIndentOf lwc' in if minIndent < otherIndent then appendLwc lwc lwc' : lwcs else lwc : lwc' : lwcs minimumIndentOf = minimum . fmap lineIndent . 
lwcLines skipEmpty :: [Line' []] -> [Line' NonEmpty] skipEmpty = mapMaybe (traverseLine NEL.nonEmpty) setComments :: [Text] -> UnboundEntityDef -> UnboundEntityDef setComments [] = id setComments comments = overUnboundEntityDef (\ed -> ed { entityComments = Just (T.unlines comments) }) fixForeignKeysAll :: [UnboundEntityDef] -> [EntityDef] fixForeignKeysAll unEnts = map fixForeignKeys unEnts where ents = map unboundEntityDef unEnts entLookup = M.fromList $ map (\e -> (entityHaskell e, e)) ents fixForeignKeys :: UnboundEntityDef -> EntityDef fixForeignKeys (UnboundEntityDef foreigns ent) = ent { entityForeigns = map (fixForeignKey ent) foreigns } -- check the count and the sqltypes match and update the foreignFields with the names of the primary columns fixForeignKey :: EntityDef -> UnboundForeignDef -> ForeignDef fixForeignKey ent (UnboundForeignDef foreignFieldTexts fdef) = let pentError = error $ "could not find table " ++ show (foreignRefTableHaskell fdef) ++ " fdef=" ++ show fdef ++ " allnames=" ++ show (map (unHaskellName . entityHaskell . unboundEntityDef) unEnts) ++ "\n\nents=" ++ show ents pent = fromMaybe pentError $ M.lookup (foreignRefTableHaskell fdef) entLookup in case entityPrimary pent of Just pdef -> if length foreignFieldTexts /= length (compositeFields pdef) then lengthError pdef else let fds_ffs = zipWith (toForeignFields pent) foreignFieldTexts (compositeFields pdef) dbname = unDBName (entityDB pent) oldDbName = unDBName (foreignRefTableDBName fdef) in fdef { foreignFields = map snd fds_ffs , foreignNullable = setNull $ map fst fds_ffs , foreignRefTableDBName = DBName dbname , foreignConstraintNameDBName = DBName . T.replace oldDbName dbname . unDBName $ foreignConstraintNameDBName fdef } Nothing -> error $ "no explicit primary key fdef="++show fdef++ " ent="++show ent where setNull :: [FieldDef] -> Bool setNull [] = error "setNull: impossible!" setNull (fd:fds) = let nullSetting = isNull fd in if all ((nullSetting ==) . isNull) fds then nullSetting else error $ "foreign key columns must all be nullable or non-nullable" ++ show (map (unHaskellName . fieldHaskell) (fd:fds)) isNull = (NotNullable /=) . nullable . 
fieldAttrs toForeignFields pent fieldText pfd = case chktypes fd haskellField (entityFields pent) pfh of Just err -> error err Nothing -> (fd, ((haskellField, fieldDB fd), (pfh, pfdb))) where fd = getFd (entityFields ent) haskellField haskellField = HaskellName fieldText (pfh, pfdb) = (fieldHaskell pfd, fieldDB pfd) chktypes :: FieldDef -> HaskellName -> [FieldDef] -> HaskellName -> Maybe String chktypes ffld _fkey pflds pkey = if fieldType ffld == fieldType pfld then Nothing else Just $ "fieldType mismatch: " ++ show (fieldType ffld) ++ ", " ++ show (fieldType pfld) where pfld = getFd pflds pkey entName = entityHaskell ent getFd [] t = error $ "foreign key constraint for: " ++ show (unHaskellName entName) ++ " unknown column: " ++ show t getFd (f:fs) t | fieldHaskell f == t = f | otherwise = getFd fs t lengthError pdef = error $ "found " ++ show (length foreignFieldTexts) ++ " fkeys and " ++ show (length (compositeFields pdef)) ++ " pkeys: fdef=" ++ show fdef ++ " pdef=" ++ show pdef data UnboundEntityDef = UnboundEntityDef { _unboundForeignDefs :: [UnboundForeignDef] , unboundEntityDef :: EntityDef } overUnboundEntityDef :: (EntityDef -> EntityDef) -> UnboundEntityDef -> UnboundEntityDef overUnboundEntityDef f ubed = ubed { unboundEntityDef = f (unboundEntityDef ubed) } lookupKeyVal :: Text -> [Text] -> Maybe Text lookupKeyVal key = lookupPrefix $ key `mappend` "=" lookupPrefix :: Text -> [Text] -> Maybe Text lookupPrefix prefix = msum . map (T.stripPrefix prefix) -- | Construct an entity definition. mkEntityDef :: PersistSettings -> Text -- ^ name -> [Attr] -- ^ entity attributes -> [Line] -- ^ indented lines -> UnboundEntityDef mkEntityDef ps name entattribs lines = UnboundEntityDef foreigns $ EntityDef { entityHaskell = entName , entityDB = DBName $ getDbName ps name' entattribs -- idField is the user-specified Id -- otherwise useAutoIdField -- but, adjust it if the user specified a Primary , entityId = (setComposite primaryComposite $ fromMaybe autoIdField idField) , entityAttrs = entattribs , entityFields = cols , entityUniques = uniqs , entityForeigns = [] , entityDerives = derives , entityExtra = extras , entitySum = isSum , entityComments = comments } where comments = Nothing entName = HaskellName name' (isSum, name') = case T.uncons name of Just ('+', x) -> (True, x) _ -> (False, name) (attribs, extras) = splitExtras lines attribPrefix = flip lookupKeyVal entattribs idName | Just _ <- attribPrefix "id" = error "id= is deprecated, ad a field named 'Id' and use sql=" | otherwise = Nothing (idField, primaryComposite, uniqs, foreigns) = foldl' (\(mid, mp, us, fs) attr -> let (i, p, u, f) = takeConstraint ps name' cols attr squish xs m = xs `mappend` maybeToList m in (just1 mid i, just1 mp p, squish us u, squish fs f)) (Nothing, Nothing, [],[]) attribs derives = concat $ mapMaybe takeDerives attribs cols :: [FieldDef] cols = reverse . fst . 
foldr k ([], []) $ reverse attribs k x (!acc, !comments) = case isComment =<< listToMaybe x of Just comment -> (acc, comment : comments) Nothing -> ( maybe id (:) (setFieldComments comments <$> takeColsEx ps x) acc , [] ) setFieldComments [] x = x setFieldComments xs fld = fld { fieldComments = Just (T.unlines xs) } autoIdField = mkAutoIdField ps entName (DBName `fmap` idName) idSqlType idSqlType = maybe SqlInt64 (const $ SqlOther "Primary Key") primaryComposite setComposite Nothing fd = fd setComposite (Just c) fd = fd { fieldReference = CompositeRef c } just1 :: (Show x) => Maybe x -> Maybe x -> Maybe x just1 (Just x) (Just y) = error $ "expected only one of: " `mappend` show x `mappend` " " `mappend` show y just1 x y = x `mplus` y mkAutoIdField :: PersistSettings -> HaskellName -> Maybe DBName -> SqlType -> FieldDef mkAutoIdField ps entName idName idSqlType = FieldDef { fieldHaskell = HaskellName "Id" -- this should be modeled as a Maybe -- but that sucks for non-ID field -- TODO: use a sumtype FieldDef | IdFieldDef , fieldDB = fromMaybe (DBName $ psIdName ps) idName , fieldType = FTTypeCon Nothing $ keyConName $ unHaskellName entName , fieldSqlType = idSqlType -- the primary field is actually a reference to the entity , fieldReference = ForeignRef entName defaultReferenceTypeCon , fieldAttrs = [] , fieldStrict = True , fieldComments = Nothing } defaultReferenceTypeCon :: FieldType defaultReferenceTypeCon = FTTypeCon (Just "Data.Int") "Int64" keyConName :: Text -> Text keyConName entName = entName `mappend` "Id" splitExtras :: [Line] -> ([[Text]], M.Map Text [[Text]]) splitExtras [] = ([], M.empty) splitExtras (Line indent [name]:rest) | not (T.null name) && isUpper (T.head name) = let (children, rest') = span ((> indent) . lineIndent) rest (x, y) = splitExtras rest' in (x, M.insert name (map tokens children) y) splitExtras (Line _ ts:rest) = let (x, y) = splitExtras rest in (ts:x, y) takeColsEx :: PersistSettings -> [Text] -> Maybe FieldDef takeColsEx = takeCols (\ft perr -> error $ "Invalid field type " ++ show ft ++ " " ++ perr) takeCols :: (Text -> String -> Maybe FieldDef) -> PersistSettings -> [Text] -> Maybe FieldDef takeCols _ _ ("deriving":_) = Nothing takeCols onErr ps (n':typ:rest) | not (T.null n) && isLower (T.head n) = case parseFieldType typ of Left err -> onErr typ err Right ft -> Just FieldDef { fieldHaskell = HaskellName n , fieldDB = DBName $ getDbName ps n rest , fieldType = ft , fieldSqlType = SqlOther $ "SqlType unset for " `mappend` n , fieldAttrs = rest , fieldStrict = fromMaybe (psStrictFields ps) mstrict , fieldReference = NoReference , fieldComments = Nothing } where (mstrict, n) | Just x <- T.stripPrefix "!" 
n' = (Just True, x) | Just x <- T.stripPrefix "~" n' = (Just False, x) | otherwise = (Nothing, n') takeCols _ _ _ = Nothing getDbName :: PersistSettings -> Text -> [Text] -> Text getDbName ps n [] = psToDBName ps n getDbName ps n (a:as) = fromMaybe (getDbName ps n as) $ T.stripPrefix "sql=" a takeConstraint :: PersistSettings -> Text -> [FieldDef] -> [Text] -> (Maybe FieldDef, Maybe CompositeDef, Maybe UniqueDef, Maybe UnboundForeignDef) takeConstraint ps tableName defs (n:rest) | not (T.null n) && isUpper (T.head n) = takeConstraint' where takeConstraint' | n == "Unique" = (Nothing, Nothing, Just $ takeUniq ps tableName defs rest, Nothing) | n == "Foreign" = (Nothing, Nothing, Nothing, Just $ takeForeign ps tableName defs rest) | n == "Primary" = (Nothing, Just $ takeComposite defs rest, Nothing, Nothing) | n == "Id" = (Just $ takeId ps tableName (n:rest), Nothing, Nothing, Nothing) | otherwise = (Nothing, Nothing, Just $ takeUniq ps "" defs (n:rest), Nothing) -- retain compatibility with original unique constraint takeConstraint _ _ _ _ = (Nothing, Nothing, Nothing, Nothing) -- TODO: this is hacky (the double takeCols, the setFieldDef stuff, and setIdName. -- need to re-work takeCols function takeId :: PersistSettings -> Text -> [Text] -> FieldDef takeId ps tableName (n:rest) = fromMaybe (error "takeId: impossible!") $ setFieldDef $ takeCols (\_ _ -> addDefaultIdType) ps (field:rest `mappend` setIdName) where field = case T.uncons n of Nothing -> error "takeId: empty field" Just (f, ield) -> toLower f `T.cons` ield addDefaultIdType = takeColsEx ps (field : keyCon : rest `mappend` setIdName) setFieldDef = fmap (\fd -> let refFieldType = if fieldType fd == FTTypeCon Nothing keyCon then defaultReferenceTypeCon else fieldType fd in fd { fieldReference = ForeignRef (HaskellName tableName) $ refFieldType }) keyCon = keyConName tableName -- this will be ignored if there is already an existing sql= -- TODO: I think there is a ! ignore syntax that would screw this up setIdName = ["sql=" `mappend` psIdName ps] takeId _ tableName _ = error $ "empty Id field for " `mappend` show tableName takeComposite :: [FieldDef] -> [Text] -> CompositeDef takeComposite fields pkcols = CompositeDef (map (getDef fields) pkcols) attrs where (_, attrs) = break ("!" `T.isPrefixOf`) pkcols getDef [] t = error $ "Unknown column in primary key constraint: " ++ show t getDef (d:ds) t | fieldHaskell d == HaskellName t = if nullable (fieldAttrs d) /= NotNullable then error $ "primary key column cannot be nullable: " ++ show t else d | otherwise = getDef ds t -- Unique UppercaseConstraintName list of lowercasefields terminated -- by ! or sql= such that a unique constraint can look like: -- `UniqueTestNull fieldA fieldB sql=ConstraintNameInDatabase !force` -- Here using sql= sets the name of the constraint. takeUniq :: PersistSettings -> Text -> [FieldDef] -> [Text] -> UniqueDef takeUniq ps tableName defs (n:rest) | not (T.null n) && isUpper (T.head n) = UniqueDef (HaskellName n) dbName (map (HaskellName &&& getDBName defs) fields) attrs where isAttr a = "!" 
`T.isPrefixOf` a isSqlName a = "sql=" `T.isPrefixOf` a isNonField a = isAttr a || isSqlName a (fields, nonFields) = break isNonField rest attrs = filter isAttr nonFields usualDbName = DBName $ psToDBName ps (tableName `T.append` n) sqlName :: Maybe DBName sqlName = case find isSqlName nonFields of Nothing -> Nothing (Just t) -> case drop 1 $ T.splitOn "=" t of (x : _) -> Just (DBName x) _ -> Nothing dbName = fromMaybe usualDbName sqlName getDBName [] t = error $ "Unknown column in unique constraint: " ++ show t ++ " " ++ show defs ++ show n ++ " " ++ show attrs getDBName (d:ds) t | fieldHaskell d == HaskellName t = fieldDB d | otherwise = getDBName ds t takeUniq _ tableName _ xs = error $ "invalid unique constraint on table[" ++ show tableName ++ "] expecting an uppercase constraint name xs=" ++ show xs data UnboundForeignDef = UnboundForeignDef { _unboundFields :: [Text] -- ^ fields in other entity , _unboundForeignDef :: ForeignDef } takeForeign :: PersistSettings -> Text -> [FieldDef] -> [Text] -> UnboundForeignDef takeForeign ps tableName _defs (refTableName:n:rest) | not (T.null n) && isLower (T.head n) = UnboundForeignDef fields $ ForeignDef { foreignRefTableHaskell = HaskellName refTableName , foreignRefTableDBName = DBName $ psToDBName ps refTableName , foreignConstraintNameHaskell = HaskellName n , foreignConstraintNameDBName = DBName $ psToDBName ps (tableName `T.append` n) , foreignFields = [] , foreignAttrs = attrs , foreignNullable = False } where (fields,attrs) = break ("!" `T.isPrefixOf`) rest takeForeign _ tableName _ xs = error $ "invalid foreign key constraint on table[" ++ show tableName ++ "] expecting a lower case constraint name xs=" ++ show xs takeDerives :: [Text] -> Maybe [Text] takeDerives ("deriving":rest) = Just rest takeDerives _ = Nothing nullable :: [Text] -> IsNullable nullable s | "Maybe" `elem` s = Nullable ByMaybeAttr | "nullable" `elem` s = Nullable ByNullableAttr | otherwise = NotNullable persistent-2.10.5.2/Database/Persist/Types.hs0000644000000000000000000000061613620641325017127 0ustar0000000000000000module Database.Persist.Types ( module Database.Persist.Types.Base , SomePersistField (..) , Update (..) , BackendSpecificUpdate , SelectOpt (..) , Filter (..) , FilterValue (..) , BackendSpecificFilter , Key , Entity (..) ) where import Database.Persist.Types.Base import Database.Persist.Class.PersistField import Database.Persist.Class.PersistEntity persistent-2.10.5.2/Database/Persist/Class.hs0000644000000000000000000001011513464141060017060 0ustar0000000000000000{-# LANGUAGE ConstraintKinds #-} module Database.Persist.Class ( ToBackendKey (..) -- * PersistStore -- | -- -- All the examples present here will be explained based on these schemas, datasets and functions: -- -- = schema-1 -- -- #schema-persist-store-1# -- -- > share [mkPersist sqlSettings, mkMigrate "migrateAll"] [persistLowerCase| -- > User -- > name String -- > age Int -- > deriving Show -- > |] -- -- = dataset-1 -- -- #dataset-persist-store-1# -- -- > +----+-------+-----+ -- > | id | name | age | -- > +----+-------+-----+ -- > | 1 | SPJ | 40 | -- > +----+-------+-----+ -- > | 2 | Simon | 41 | -- > +----+-------+-----+ , PersistCore (..) , PersistStore , PersistStoreRead (..) , PersistStoreWrite (..) 
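    -- A sketch of typical store usage against schema-1 and dataset-1 above
    -- (the function name is illustrative, not part of this module):
    --
    -- @
    -- insertAndFetch :: MonadIO m => ReaderT SqlBackend m (Maybe User)
    -- insertAndFetch = do
    --     bobId <- insert (User \"Bob\" 32)
    --     get bobId
    -- @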
, PersistRecordBackend , getJust , getJustEntity , getEntity , belongsTo , belongsToJust , insertEntity , insertRecord -- * PersistUnique -- | -- -- All the examples present here will be explained based on these two schemas and the dataset: -- -- = schema-1 -- This schema has single unique constraint. -- -- #schema-persist-unique-1# -- -- > share [mkPersist sqlSettings, mkMigrate "migrateAll"] [persistLowerCase| -- > User -- > name String -- > age Int -- > UniqueUserName name -- > deriving Show -- > |] -- -- = schema-2 -- This schema has two unique constraints. -- -- #schema-persist-unique-2# -- -- > share [mkPersist sqlSettings, mkMigrate "migrateAll"] [persistLowerCase| -- > User -- > name String -- > age Int -- > UniqueUserName name -- > UniqueUserAge age -- > deriving Show -- > |] -- -- = dataset-1 -- -- #dataset-persist-unique-1# -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |1 |SPJ |40 | -- > +-----+-----+-----+ -- > |2 |Simon|41 | -- > +-----+-----+-----+ , PersistUnique , PersistUniqueRead (..) , PersistUniqueWrite (..) , OnlyOneUniqueKey (..) , AtLeastOneUniqueKey (..) , NoUniqueKeysError , MultipleUniqueKeysError , getByValue , insertBy , insertUniqueEntity , replaceUnique , checkUnique , onlyUnique -- * PersistQuery , PersistQuery , PersistQueryRead (..) , PersistQueryWrite (..) , selectSource , selectKeys , selectList , selectKeysList -- * DeleteCascade , DeleteCascade (..) , deleteCascadeWhere -- * PersistEntity , PersistEntity (..) -- * PersistField , PersistField (..) -- * PersistConfig , PersistConfig (..) , entityValues -- * Lifting , HasPersistBackend (..) , IsPersistBackend () , liftPersist , BackendCompatible (..) -- * JSON utilities , keyValueEntityToJSON, keyValueEntityFromJSON , entityIdToJSON, entityIdFromJSON , toPersistValueJSON, fromPersistValueJSON ) where import Database.Persist.Class.DeleteCascade import Database.Persist.Class.PersistConfig import Database.Persist.Class.PersistEntity import Database.Persist.Class.PersistField import Database.Persist.Class.PersistQuery import Database.Persist.Class.PersistStore import Database.Persist.Class.PersistUnique -- | A backwards-compatible alias for those that don't care about distinguishing between read and write queries. -- It signifies the assumption that, by default, a backend can write as well as read. type PersistUnique a = PersistUniqueWrite a -- | A backwards-compatible alias for those that don't care about distinguishing between read and write queries. -- It signifies the assumption that, by default, a backend can write as well as read. type PersistQuery a = PersistQueryWrite a -- | A backwards-compatible alias for those that don't care about distinguishing between read and write queries. -- It signifies the assumption that, by default, a backend can write as well as read. type PersistStore a = PersistStoreWrite a persistent-2.10.5.2/Database/Persist/Sql.hs0000644000000000000000000000540313614061357016565 0ustar0000000000000000module Database.Persist.Sql ( module Database.Persist.Sql.Types , module Database.Persist.Sql.Class , module Database.Persist.Sql.Run , module Database.Persist.Sql.Migration , module Database.Persist , module Database.Persist.Sql.Orphan.PersistStore , rawQuery , rawQueryRes , rawExecute , rawExecuteCount , rawSql , deleteWhereCount , updateWhereCount , transactionSave , transactionSaveWithIsolation , transactionUndo , transactionUndoWithIsolation , IsolationLevel (..) 
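    -- A sketch of the transaction helpers defined below (the function name and
    -- the @User@ entity are illustrative):
    --
    -- @
    -- saveEarly :: MonadIO m => ReaderT SqlBackend m ()
    -- saveEarly = do
    --     insert_ (User \"Bob\" 32)
    --     transactionSave
    -- @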
, getStmtConn -- * Internal , module Database.Persist.Sql.Internal , decorateSQLWithLimitOffset ) where import Control.Monad.IO.Class import Control.Monad.Trans.Reader (ReaderT, ask) import Database.Persist import Database.Persist.Sql.Types import Database.Persist.Sql.Types.Internal (IsolationLevel (..)) import Database.Persist.Sql.Class import Database.Persist.Sql.Run hiding (withResourceTimeout, rawAcquireSqlConn) import Database.Persist.Sql.Raw import Database.Persist.Sql.Migration import Database.Persist.Sql.Internal import Database.Persist.Sql.Orphan.PersistQuery import Database.Persist.Sql.Orphan.PersistStore import Database.Persist.Sql.Orphan.PersistUnique () -- | Commit the current transaction and begin a new one. -- This is used when a transaction commit is required within the context of 'runSqlConn' -- (which brackets its provided action with a transaction begin/commit pair). -- -- @since 1.2.0 transactionSave :: MonadIO m => ReaderT SqlBackend m () transactionSave = do conn <- ask let getter = getStmtConn conn liftIO $ connCommit conn getter >> connBegin conn getter Nothing -- | Commit the current transaction and begin a new one with the specified isolation level. -- -- @since 2.9.0 transactionSaveWithIsolation :: MonadIO m => IsolationLevel -> ReaderT SqlBackend m () transactionSaveWithIsolation isolation = do conn <- ask let getter = getStmtConn conn liftIO $ connCommit conn getter >> connBegin conn getter (Just isolation) -- | Roll back the current transaction and begin a new one. -- This rolls back to the state of the last call to 'transactionSave' or the enclosing -- 'runSqlConn' call. -- -- @since 1.2.0 transactionUndo :: MonadIO m => ReaderT SqlBackend m () transactionUndo = do conn <- ask let getter = getStmtConn conn liftIO $ connRollback conn getter >> connBegin conn getter Nothing -- | Roll back the current transaction and begin a new one with the specified isolation level. -- -- @since 2.9.0 transactionUndoWithIsolation :: MonadIO m => IsolationLevel -> ReaderT SqlBackend m () transactionUndoWithIsolation isolation = do conn <- ask let getter = getStmtConn conn liftIO $ connRollback conn getter >> connBegin conn getter (Just isolation) persistent-2.10.5.2/Database/Persist/Sql/Util.hs0000644000000000000000000001105013464141060017466 0ustar0000000000000000module Database.Persist.Sql.Util ( parseEntityValues , entityColumnNames , keyAndEntityColumnNames , entityColumnCount , isIdField , hasCompositeKey , dbIdColumns , dbIdColumnsEsc , dbColumns , updateFieldDef , updatePersistValue , mkUpdateText , mkUpdateText' , commaSeparated , parenWrapped ) where import Data.Maybe (isJust) import Data.Monoid ((<>)) import Data.Text (Text, pack) import qualified Data.Text as T import Database.Persist ( Entity(Entity), EntityDef, EntityField, HaskellName(HaskellName) , PersistEntity, PersistValue , keyFromValues, fromPersistValues, fieldDB, entityId, entityPrimary , entityFields, entityKeyFields, fieldHaskell, compositeFields, persistFieldDef , keyAndEntityFields, toPersistValue, DBName, Update(..), PersistUpdate(..) , FieldDef ) import Database.Persist.Sql.Types (Sql, SqlBackend, connEscapeName) entityColumnNames :: EntityDef -> SqlBackend -> [Sql] entityColumnNames ent conn = (if hasCompositeKey ent then [] else [connEscapeName conn $ fieldDB (entityId ent)]) <> map (connEscapeName conn . fieldDB) (entityFields ent) keyAndEntityColumnNames :: EntityDef -> SqlBackend -> [Sql] keyAndEntityColumnNames ent conn = map (connEscapeName conn . 
fieldDB) (keyAndEntityFields ent) entityColumnCount :: EntityDef -> Int entityColumnCount e = length (entityFields e) + if hasCompositeKey e then 0 else 1 hasCompositeKey :: EntityDef -> Bool hasCompositeKey = isJust . entityPrimary dbIdColumns :: SqlBackend -> EntityDef -> [Text] dbIdColumns conn = dbIdColumnsEsc (connEscapeName conn) dbIdColumnsEsc :: (DBName -> Text) -> EntityDef -> [Text] dbIdColumnsEsc esc t = map (esc . fieldDB) $ entityKeyFields t dbColumns :: SqlBackend -> EntityDef -> [Text] dbColumns conn t = case entityPrimary t of Just _ -> flds Nothing -> escapeDB (entityId t) : flds where escapeDB = connEscapeName conn . fieldDB flds = map escapeDB (entityFields t) parseEntityValues :: PersistEntity record => EntityDef -> [PersistValue] -> Either Text (Entity record) parseEntityValues t vals = case entityPrimary t of Just pdef -> let pks = map fieldHaskell $ compositeFields pdef keyvals = map snd . filter ((`elem` pks) . fst) $ zip (map fieldHaskell $ entityFields t) vals in fromPersistValuesComposite' keyvals vals Nothing -> fromPersistValues' vals where fromPersistValues' (kpv:xs) = -- oracle returns Double case fromPersistValues xs of Left e -> Left e Right xs' -> case keyFromValues [kpv] of Left _ -> error $ "fromPersistValues': keyFromValues failed on " ++ show kpv Right k -> Right (Entity k xs') fromPersistValues' xs = Left $ pack ("error in fromPersistValues' xs=" ++ show xs) fromPersistValuesComposite' keyvals xs = case fromPersistValues xs of Left e -> Left e Right xs' -> case keyFromValues keyvals of Left _ -> error "fromPersistValuesComposite': keyFromValues failed" Right key -> Right (Entity key xs') isIdField :: PersistEntity record => EntityField record typ -> Bool isIdField f = fieldHaskell (persistFieldDef f) == HaskellName "Id" -- | Gets the 'FieldDef' for an 'Update'. updateFieldDef :: PersistEntity v => Update v -> FieldDef updateFieldDef (Update f _ _) = persistFieldDef f updateFieldDef BackendUpdate {} = error "updateFieldDef: did not expect BackendUpdate" updatePersistValue :: Update v -> PersistValue updatePersistValue (Update _ v _) = toPersistValue v updatePersistValue (BackendUpdate{}) = error "updatePersistValue: did not expect BackendUpdate" commaSeparated :: [Text] -> Text commaSeparated = T.intercalate ", " mkUpdateText :: PersistEntity record => SqlBackend -> Update record -> Text mkUpdateText conn = mkUpdateText' (connEscapeName conn) id mkUpdateText' :: PersistEntity record => (DBName -> Text) -> (Text -> Text) -> Update record -> Text mkUpdateText' escapeName refColumn x = case updateUpdate x of Assign -> n <> "=?" Add -> T.concat [n, "=", refColumn n, "+?"] Subtract -> T.concat [n, "=", refColumn n, "-?"] Multiply -> T.concat [n, "=", refColumn n, "*?"] Divide -> T.concat [n, "=", refColumn n, "/?"] BackendSpecificUpdate up -> error . T.unpack $ "mkUpdateText: BackendSpecificUpdate " <> up <> " not supported" where n = escapeName . fieldDB . updateFieldDef $ x parenWrapped :: Text -> Text parenWrapped t = T.concat ["(", t, ")"]persistent-2.10.5.2/Database/Persist/Sql/Types/Internal.hs0000644000000000000000000002011613464141060021434 0ustar0000000000000000{-# LANGUAGE ConstraintKinds #-} {-# LANGUAGE DeriveDataTypeable #-} {-# LANGUAGE RankNTypes #-} module Database.Persist.Sql.Types.Internal ( HasPersistBackend (..) , IsPersistBackend (..) , SqlReadBackend (..) , SqlWriteBackend (..) , readToUnknown , readToWrite , writeToUnknown , LogFunc , InsertSqlResult (..) , Statement (..) , IsolationLevel (..) 
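    -- A small illustration of 'makeIsolationLevelStatement' (defined below),
    -- specialised to 'Text':
    --
    -- @
    -- makeIsolationLevelStatement Serializable
    --     == (\"SET TRANSACTION ISOLATION LEVEL SERIALIZABLE\" :: Text)
    -- @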
, makeIsolationLevelStatement , SqlBackend (..) , SqlBackendCanRead , SqlBackendCanWrite , SqlReadT , SqlWriteT , IsSqlBackend ) where import Data.List.NonEmpty (NonEmpty(..)) import Control.Monad.IO.Class (MonadIO (..)) import Control.Monad.Logger (LogSource, LogLevel) import Control.Monad.Trans.Class (lift) import Control.Monad.Trans.Reader (ReaderT, runReaderT, ask) import Data.Acquire (Acquire) import Data.Conduit (ConduitM) import Data.Int (Int64) import Data.IORef (IORef) import Data.Map (Map) import Data.Monoid ((<>)) import Data.String (IsString) import Data.Text (Text) import Data.Typeable (Typeable) import Language.Haskell.TH.Syntax (Loc) import System.Log.FastLogger (LogStr) import Database.Persist.Class ( HasPersistBackend (..) , PersistQueryRead, PersistQueryWrite , PersistStoreRead, PersistStoreWrite , PersistUniqueRead, PersistUniqueWrite , BackendCompatible(..) ) import Database.Persist.Class.PersistStore (IsPersistBackend (..)) import Database.Persist.Types type LogFunc = Loc -> LogSource -> LogLevel -> LogStr -> IO () data InsertSqlResult = ISRSingle Text | ISRInsertGet Text Text | ISRManyKeys Text [PersistValue] data Statement = Statement { stmtFinalize :: IO () , stmtReset :: IO () , stmtExecute :: [PersistValue] -> IO Int64 , stmtQuery :: forall m. MonadIO m => [PersistValue] -> Acquire (ConduitM () [PersistValue] m ()) } -- | Please refer to the documentation for the database in question for a full -- overview of the semantics of the varying isloation levels data IsolationLevel = ReadUncommitted | ReadCommitted | RepeatableRead | Serializable deriving (Show, Eq, Enum, Ord, Bounded) makeIsolationLevelStatement :: (Monoid s, IsString s) => IsolationLevel -> s makeIsolationLevelStatement l = "SET TRANSACTION ISOLATION LEVEL " <> case l of ReadUncommitted -> "READ UNCOMMITTED" ReadCommitted -> "READ COMMITTED" RepeatableRead -> "REPEATABLE READ" Serializable -> "SERIALIZABLE" data SqlBackend = SqlBackend { connPrepare :: Text -> IO Statement -- | table name, column names, id name, either 1 or 2 statements to run , connInsertSql :: EntityDef -> [PersistValue] -> InsertSqlResult , connInsertManySql :: Maybe (EntityDef -> [[PersistValue]] -> InsertSqlResult) -- ^ SQL for inserting many rows and returning their primary keys, for -- backends that support this functioanlity. If 'Nothing', rows will be -- inserted one-at-a-time using 'connInsertSql'. , connUpsertSql :: Maybe (EntityDef -> NonEmpty UniqueDef -> Text -> Text) -- ^ Some databases support performing UPSERT _and_ RETURN entity -- in a single call. -- -- This field when set will be used to generate the UPSERT+RETURN sql given -- * an entity definition -- * updates to be run on unique key(s) collision -- -- When left as 'Nothing', we find the unique key from entity def before -- * trying to fetch an entity by said key -- * perform an update when result found, else issue an insert -- * return new entity from db -- -- @since 2.6 , connPutManySql :: Maybe (EntityDef -> Int -> Text) -- ^ Some databases support performing bulk UPSERT, specifically -- "insert or replace many records" in a single call. -- -- This field when set, given -- * an entity definition -- * number of records to be inserted -- should produce a PUT MANY sql with placeholders for records -- -- When left as 'Nothing', we default to using 'defaultPutMany'. 
-- -- @since 2.8.1 , connStmtMap :: IORef (Map Text Statement) , connClose :: IO () , connMigrateSql :: [EntityDef] -> (Text -> IO Statement) -> EntityDef -> IO (Either [Text] [(Bool, Text)]) , connBegin :: (Text -> IO Statement) -> Maybe IsolationLevel -> IO () , connCommit :: (Text -> IO Statement) -> IO () , connRollback :: (Text -> IO Statement) -> IO () , connEscapeName :: DBName -> Text , connNoLimit :: Text , connRDBMS :: Text , connLimitOffset :: (Int,Int) -> Bool -> Text -> Text , connLogFunc :: LogFunc , connMaxParams :: Maybe Int -- ^ Some databases (probably only Sqlite) have a limit on how -- many question-mark parameters may be used in a statement -- -- @since 2.6.1 , connRepsertManySql :: Maybe (EntityDef -> Int -> Text) -- ^ Some databases support performing bulk an atomic+bulk INSERT where -- constraint conflicting entities can replace existing entities. -- -- This field when set, given -- * an entity definition -- * number of records to be inserted -- should produce a INSERT sql with placeholders for primary+record fields -- -- When left as 'Nothing', we default to using 'defaultRepsertMany'. -- -- @since 2.9.0 } deriving Typeable instance HasPersistBackend SqlBackend where type BaseBackend SqlBackend = SqlBackend persistBackend = id instance IsPersistBackend SqlBackend where mkPersistBackend = id -- | An SQL backend which can only handle read queries -- -- The constructor was exposed in 2.10.0. newtype SqlReadBackend = SqlReadBackend { unSqlReadBackend :: SqlBackend } deriving Typeable instance HasPersistBackend SqlReadBackend where type BaseBackend SqlReadBackend = SqlBackend persistBackend = unSqlReadBackend instance IsPersistBackend SqlReadBackend where mkPersistBackend = SqlReadBackend -- | An SQL backend which can handle read or write queries -- -- The constructor was exposed in 2.10.0 newtype SqlWriteBackend = SqlWriteBackend { unSqlWriteBackend :: SqlBackend } deriving Typeable instance HasPersistBackend SqlWriteBackend where type BaseBackend SqlWriteBackend = SqlBackend persistBackend = unSqlWriteBackend instance IsPersistBackend SqlWriteBackend where mkPersistBackend = SqlWriteBackend -- | Useful for running a write query against an untagged backend with unknown capabilities. writeToUnknown :: Monad m => ReaderT SqlWriteBackend m a -> ReaderT SqlBackend m a writeToUnknown ma = do unknown <- ask lift . runReaderT ma $ SqlWriteBackend unknown -- | Useful for running a read query against a backend with read and write capabilities. readToWrite :: Monad m => ReaderT SqlReadBackend m a -> ReaderT SqlWriteBackend m a readToWrite ma = do write <- ask lift . runReaderT ma . SqlReadBackend $ unSqlWriteBackend write -- | Useful for running a read query against a backend with unknown capabilities. readToUnknown :: Monad m => ReaderT SqlReadBackend m a -> ReaderT SqlBackend m a readToUnknown ma = do unknown <- ask lift . runReaderT ma $ SqlReadBackend unknown -- | A constraint synonym which witnesses that a backend is SQL and can run read queries. type SqlBackendCanRead backend = ( BackendCompatible SqlBackend backend , PersistQueryRead backend, PersistStoreRead backend, PersistUniqueRead backend ) -- | A constraint synonym which witnesses that a backend is SQL and can run read and write queries. type SqlBackendCanWrite backend = ( SqlBackendCanRead backend , PersistQueryWrite backend, PersistStoreWrite backend, PersistUniqueWrite backend ) -- | Like @SqlPersistT@ but compatible with any SQL backend which can handle read queries. type SqlReadT m a = forall backend. 
(SqlBackendCanRead backend) => ReaderT backend m a -- | Like @SqlPersistT@ but compatible with any SQL backend which can handle read and write queries. type SqlWriteT m a = forall backend. (SqlBackendCanWrite backend) => ReaderT backend m a -- | A backend which is a wrapper around @SqlBackend@. type IsSqlBackend backend = (IsPersistBackend backend, BaseBackend backend ~ SqlBackend) persistent-2.10.5.2/Database/Persist/Types/Base.hs0000644000000000000000000004771113555610075020015 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable #-} {-# OPTIONS_GHC -fno-warn-deprecations #-} -- usage of Error typeclass module Database.Persist.Types.Base where import Control.Arrow (second) import Control.Exception (Exception) import Control.Monad.Trans.Error (Error (..)) import qualified Data.Aeson as A import Data.Bits (shiftL, shiftR) import Data.ByteString (ByteString, foldl') import qualified Data.ByteString as BS import qualified Data.ByteString.Base64 as B64 import qualified Data.ByteString.Char8 as BS8 import qualified Data.HashMap.Strict as HM import Data.Int (Int64) import Data.Map (Map) import qualified Data.Scientific import Data.Text (Text, pack) import qualified Data.Text as T import qualified Data.Text.Encoding as TE import Data.Text.Encoding.Error (lenientDecode) import Data.Time (Day, TimeOfDay, UTCTime) import Data.Typeable (Typeable) import qualified Data.Vector as V import Data.Word (Word32) import Numeric (showHex, readHex) import Web.PathPieces (PathPiece(..)) import Web.HttpApiData (ToHttpApiData (..), FromHttpApiData (..), parseUrlPieceMaybe, showTextData, readTextData, parseBoundedTextData) -- | A 'Checkmark' should be used as a field type whenever a -- uniqueness constraint should guarantee that a certain kind of -- record may appear at most once, but other kinds of records may -- appear any number of times. -- -- /NOTE:/ You need to mark any @Checkmark@ fields as @nullable@ -- (see the following example). -- -- For example, suppose there's a @Location@ entity that -- represents where a user has lived: -- -- @ -- Location -- user UserId -- name Text -- current Checkmark nullable -- -- UniqueLocation user current -- @ -- -- The @UniqueLocation@ constraint allows any number of -- 'Inactive' @Location@s to be @current@. However, there may be -- at most one @current@ @Location@ per user (i.e., either zero -- or one per user). -- -- This data type works because of the way that SQL treats -- @NULL@able fields within uniqueness constraints. The SQL -- standard says that @NULL@ values should be considered -- different, so we represent 'Inactive' as SQL @NULL@, thus -- allowing any number of 'Inactive' records. On the other hand, -- we represent 'Active' as @TRUE@, so the uniqueness constraint -- will disallow more than one 'Active' record. -- -- /Note:/ There may be DBMSs that do not respect the SQL -- standard's treatment of @NULL@ values on uniqueness -- constraints, please check if this data type works before -- relying on it. -- -- The SQL @BOOLEAN@ type is used because it's the smallest data -- type available. Note that we never use @FALSE@, just @TRUE@ -- and @NULL@. Provides the same behavior @Maybe ()@ would if -- @()@ was a valid 'PersistField'. data Checkmark = Active -- ^ When used on a uniqueness constraint, there -- may be at most one 'Active' record. | Inactive -- ^ When used on a uniqueness constraint, there -- may be any number of 'Inactive' records. 
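    -- A hedged usage sketch against the @Location@ schema documented
    -- above; @LocationUser@ and @LocationCurrent@ are the field names
    -- persistent would generate for that schema:
    --
    -- > makeCurrent :: MonadIO m => UserId -> Key Location -> ReaderT SqlBackend m ()
    -- > makeCurrent uid locId = do
    -- >     updateWhere [LocationUser ==. uid] [LocationCurrent =. Inactive]
    -- >     update locId [LocationCurrent =. Active]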
deriving (Eq, Ord, Read, Show, Enum, Bounded) instance ToHttpApiData Checkmark where toUrlPiece = showTextData instance FromHttpApiData Checkmark where parseUrlPiece = parseBoundedTextData instance PathPiece Checkmark where toPathPiece Active = "active" toPathPiece Inactive = "inactive" fromPathPiece "active" = Just Active fromPathPiece "inactive" = Just Inactive fromPathPiece _ = Nothing data IsNullable = Nullable !WhyNullable | NotNullable deriving (Eq, Show) -- | The reason why a field is 'nullable' is very important. A -- field that is nullable because of a @Maybe@ tag will have its -- type changed from @A@ to @Maybe A@. OTOH, a field that is -- nullable because of a @nullable@ tag will remain with the same -- type. data WhyNullable = ByMaybeAttr | ByNullableAttr deriving (Eq, Show) -- | An 'EntityDef' represents the information that @persistent@ knows -- about an Entity. It uses this information to generate the Haskell -- datatype, the SQL migrations, and other relevant conversions. data EntityDef = EntityDef { entityHaskell :: !HaskellName -- ^ The name of the entity as Haskell understands it. , entityDB :: !DBName -- ^ The name of the database table corresponding to the entity. , entityId :: !FieldDef -- ^ The entity's primary key or identifier. , entityAttrs :: ![Attr] -- ^ The @persistent@ entity syntax allows you to add arbitrary 'Attr's -- to an entity using the @!@ operator. Those attributes are stored in -- this list. , entityFields :: ![FieldDef] -- ^ The fields for this entity. Note that the ID field will not be -- present in this list. To get all of the fields for an entity, use -- 'keyAndEntityFields'. , entityUniques :: ![UniqueDef] -- ^ The Uniqueness constraints for this entity. , entityForeigns:: ![ForeignDef] -- ^ The foreign key relationships that this entity has to other -- entities. , entityDerives :: ![Text] -- ^ A list of type classes that have been derived for this entity. , entityExtra :: !(Map Text [ExtraLine]) , entitySum :: !Bool -- ^ Whether or not this entity represents a sum type in the database. , entityComments :: !(Maybe Text) -- ^ Optional comments on the entity. -- -- @since 2.10.0 } deriving (Show, Eq, Read, Ord) entityPrimary :: EntityDef -> Maybe CompositeDef entityPrimary t = case fieldReference (entityId t) of CompositeRef c -> Just c _ -> Nothing entityKeyFields :: EntityDef -> [FieldDef] entityKeyFields ent = case entityPrimary ent of Nothing -> [entityId ent] Just pdef -> compositeFields pdef keyAndEntityFields :: EntityDef -> [FieldDef] keyAndEntityFields ent = case entityPrimary ent of Nothing -> entityId ent : entityFields ent Just _ -> entityFields ent type ExtraLine = [Text] newtype HaskellName = HaskellName { unHaskellName :: Text } deriving (Show, Eq, Read, Ord) newtype DBName = DBName { unDBName :: Text } deriving (Show, Eq, Read, Ord) type Attr = Text data FieldType = FTTypeCon (Maybe Text) Text -- ^ Optional module and name. | FTApp FieldType FieldType | FTList FieldType deriving (Show, Eq, Read, Ord) -- | A 'FieldDef' represents the inormation that @persistent@ knows about -- a field of a datatype. This includes information used to parse the field -- out of the database and what the field corresponds to. data FieldDef = FieldDef { fieldHaskell :: !HaskellName -- ^ The name of the field. 
Note that this does not corresponds to the -- record labels generated for the particular entity - record labels -- are generated with the type name prefixed to the field, so -- a 'FieldDef' that contains a @'HaskellName' "name"@ for a type -- @User@ will have a record field @userName@. , fieldDB :: !DBName -- ^ The name of the field in the database. For SQL databases, this -- corresponds to the column name. , fieldType :: !FieldType -- ^ The type of the field in Haskell. , fieldSqlType :: !SqlType -- ^ The type of the field in a SQL database. , fieldAttrs :: ![Attr] -- ^ User annotations for a field. These are provided with the @!@ -- operator. , fieldStrict :: !Bool -- ^ If this is 'True', then the Haskell datatype will have a strict -- record field. The default value for this is 'True'. , fieldReference :: !ReferenceDef , fieldComments :: !(Maybe Text) -- ^ Optional comments for a 'Field'. There is not currently a way to -- attach comments to a field in the quasiquoter. -- -- @since 2.10.0 } deriving (Show, Eq, Read, Ord) -- | There are 3 kinds of references -- 1) composite (to fields that exist in the record) -- 2) single field -- 3) embedded data ReferenceDef = NoReference | ForeignRef !HaskellName !FieldType -- ^ A ForeignRef has a late binding to the EntityDef it references via HaskellName and has the Haskell type of the foreign key in the form of FieldType | EmbedRef EmbedEntityDef | CompositeRef CompositeDef | SelfReference -- ^ A SelfReference stops an immediate cycle which causes non-termination at compile-time (issue #311). deriving (Show, Eq, Read, Ord) -- | An EmbedEntityDef is the same as an EntityDef -- But it is only used for fieldReference -- so it only has data needed for embedding data EmbedEntityDef = EmbedEntityDef { embeddedHaskell :: !HaskellName , embeddedFields :: ![EmbedFieldDef] } deriving (Show, Eq, Read, Ord) -- | An EmbedFieldDef is the same as a FieldDef -- But it is only used for embeddedFields -- so it only has data needed for embedding data EmbedFieldDef = EmbedFieldDef { emFieldDB :: !DBName , emFieldEmbed :: Maybe EmbedEntityDef , emFieldCycle :: Maybe HaskellName -- ^ 'emFieldEmbed' can create a cycle (issue #311) -- when a cycle is detected, 'emFieldEmbed' will be Nothing -- and 'emFieldCycle' will be Just } deriving (Show, Eq, Read, Ord) toEmbedEntityDef :: EntityDef -> EmbedEntityDef toEmbedEntityDef ent = embDef where embDef = EmbedEntityDef { embeddedHaskell = entityHaskell ent , embeddedFields = map toEmbedFieldDef $ entityFields ent } toEmbedFieldDef :: FieldDef -> EmbedFieldDef toEmbedFieldDef field = EmbedFieldDef { emFieldDB = fieldDB field , emFieldEmbed = case fieldReference field of EmbedRef em -> Just em SelfReference -> Just embDef _ -> Nothing , emFieldCycle = case fieldReference field of SelfReference -> Just $ entityHaskell ent _ -> Nothing } -- Type for storing the Uniqueness constraint in the Schema. 
-- Assume you have the following schema with a uniqueness -- constraint: -- Person -- name String -- age Int -- UniqueAge age -- -- This will be represented as: -- UniqueDef (HaskellName (packPTH "UniqueAge")) -- (DBName (packPTH "unique_age")) [(HaskellName (packPTH "age"), DBName (packPTH "age"))] [] -- data UniqueDef = UniqueDef { uniqueHaskell :: !HaskellName , uniqueDBName :: !DBName , uniqueFields :: ![(HaskellName, DBName)] , uniqueAttrs :: ![Attr] } deriving (Show, Eq, Read, Ord) data CompositeDef = CompositeDef { compositeFields :: ![FieldDef] , compositeAttrs :: ![Attr] } deriving (Show, Eq, Read, Ord) -- | Used instead of FieldDef -- to generate a smaller amount of code type ForeignFieldDef = (HaskellName, DBName) data ForeignDef = ForeignDef { foreignRefTableHaskell :: !HaskellName , foreignRefTableDBName :: !DBName , foreignConstraintNameHaskell :: !HaskellName , foreignConstraintNameDBName :: !DBName , foreignFields :: ![(ForeignFieldDef, ForeignFieldDef)] -- this entity plus the primary entity , foreignAttrs :: ![Attr] , foreignNullable :: Bool } deriving (Show, Eq, Read, Ord) data PersistException = PersistError Text -- ^ Generic Exception | PersistMarshalError Text | PersistInvalidField Text | PersistForeignConstraintUnmet Text | PersistMongoDBError Text | PersistMongoDBUnsupported Text deriving (Show, Typeable) instance Exception PersistException instance Error PersistException where strMsg = PersistError . pack -- | A raw value which can be stored in any backend and can be marshalled to -- and from a 'PersistField'. data PersistValue = PersistText Text | PersistByteString ByteString | PersistInt64 Int64 | PersistDouble Double | PersistRational Rational | PersistBool Bool | PersistDay Day | PersistTimeOfDay TimeOfDay | PersistUTCTime UTCTime | PersistNull | PersistList [PersistValue] | PersistMap [(Text, PersistValue)] | PersistObjectId ByteString -- ^ Intended especially for MongoDB backend | PersistArray [PersistValue] -- ^ Intended especially for PostgreSQL backend for text arrays | PersistDbSpecific ByteString -- ^ Using 'PersistDbSpecific' allows you to use types specific to a particular backend -- For example, below is a simple example of the PostGIS geography type: -- -- @ -- data Geo = Geo ByteString -- -- instance PersistField Geo where -- toPersistValue (Geo t) = PersistDbSpecific t -- -- fromPersistValue (PersistDbSpecific t) = Right $ Geo $ Data.ByteString.concat ["'", t, "'"] -- fromPersistValue _ = Left "Geo values must be converted from PersistDbSpecific" -- -- instance PersistFieldSql Geo where -- sqlType _ = SqlOther "GEOGRAPHY(POINT,4326)" -- -- toPoint :: Double -> Double -> Geo -- toPoint lat lon = Geo $ Data.ByteString.concat ["'POINT(", ps $ lon, " ", ps $ lat, ")'"] -- where ps = Data.Text.pack . 
show -- @ -- -- If Foo has a geography field, we can then perform insertions like the following: -- -- @ -- insert $ Foo (toPoint 44 44) -- @ -- deriving (Show, Read, Eq, Typeable, Ord) instance ToHttpApiData PersistValue where toUrlPiece val = case fromPersistValueText val of Left e -> error $ T.unpack e Right y -> y instance FromHttpApiData PersistValue where parseUrlPiece input = PersistInt64 <$> parseUrlPiece input PersistList <$> readTextData input PersistText <$> return input where infixl 3 Left _ y = y x _ = x instance PathPiece PersistValue where toPathPiece = toUrlPiece fromPathPiece = parseUrlPieceMaybe fromPersistValueText :: PersistValue -> Either Text Text fromPersistValueText (PersistText s) = Right s fromPersistValueText (PersistByteString bs) = Right $ TE.decodeUtf8With lenientDecode bs fromPersistValueText (PersistInt64 i) = Right $ T.pack $ show i fromPersistValueText (PersistDouble d) = Right $ T.pack $ show d fromPersistValueText (PersistRational r) = Right $ T.pack $ show r fromPersistValueText (PersistDay d) = Right $ T.pack $ show d fromPersistValueText (PersistTimeOfDay d) = Right $ T.pack $ show d fromPersistValueText (PersistUTCTime d) = Right $ T.pack $ show d fromPersistValueText PersistNull = Left "Unexpected null" fromPersistValueText (PersistBool b) = Right $ T.pack $ show b fromPersistValueText (PersistList _) = Left "Cannot convert PersistList to Text" fromPersistValueText (PersistMap _) = Left "Cannot convert PersistMap to Text" fromPersistValueText (PersistObjectId _) = Left "Cannot convert PersistObjectId to Text" fromPersistValueText (PersistArray _) = Left "Cannot convert PersistArray to Text" fromPersistValueText (PersistDbSpecific _) = Left "Cannot convert PersistDbSpecific to Text. See the documentation of PersistDbSpecific for an example of using a custom database type with Persistent." instance A.ToJSON PersistValue where toJSON (PersistText t) = A.String $ T.cons 's' t toJSON (PersistByteString b) = A.String $ T.cons 'b' $ TE.decodeUtf8 $ B64.encode b toJSON (PersistInt64 i) = A.Number $ fromIntegral i toJSON (PersistDouble d) = A.Number $ Data.Scientific.fromFloatDigits d toJSON (PersistRational r) = A.String $ T.pack $ 'r' : show r toJSON (PersistBool b) = A.Bool b toJSON (PersistTimeOfDay t) = A.String $ T.pack $ 't' : show t toJSON (PersistUTCTime u) = A.String $ T.pack $ 'u' : show u toJSON (PersistDay d) = A.String $ T.pack $ 'd' : show d toJSON PersistNull = A.Null toJSON (PersistList l) = A.Array $ V.fromList $ map A.toJSON l toJSON (PersistMap m) = A.object $ map (second A.toJSON) m toJSON (PersistDbSpecific b) = A.String $ T.cons 'p' $ TE.decodeUtf8 $ B64.encode b toJSON (PersistArray a) = A.Array $ V.fromList $ map A.toJSON a toJSON (PersistObjectId o) = A.toJSON $ showChar 'o' $ showHexLen 8 (bs2i four) $ showHexLen 16 (bs2i eight) "" where (four, eight) = BS8.splitAt 4 o -- taken from crypto-api bs2i :: ByteString -> Integer bs2i bs = foldl' (\i b -> (i `shiftL` 8) + fromIntegral b) 0 bs {-# INLINE bs2i #-} -- showHex of n padded with leading zeros if necessary to fill d digits -- taken from Data.BSON showHexLen :: (Show n, Integral n) => Int -> n -> ShowS showHexLen d n = showString (replicate (d - sigDigits n) '0') . showHex n where sigDigits 0 = 1 sigDigits n' = truncate (logBase (16 :: Double) $ fromIntegral n') + 1 instance A.FromJSON PersistValue where parseJSON (A.String t0) = case T.uncons t0 of Nothing -> fail "Null string" Just ('p', t) -> either (\_ -> fail "Invalid base64") (return . 
PersistDbSpecific) $ B64.decode $ TE.encodeUtf8 t Just ('s', t) -> return $ PersistText t Just ('b', t) -> either (\_ -> fail "Invalid base64") (return . PersistByteString) $ B64.decode $ TE.encodeUtf8 t Just ('t', t) -> fmap PersistTimeOfDay $ readMay t Just ('u', t) -> fmap PersistUTCTime $ readMay t Just ('d', t) -> fmap PersistDay $ readMay t Just ('r', t) -> fmap PersistRational $ readMay t Just ('o', t) -> maybe (fail "Invalid base64") (return . PersistObjectId) $ fmap (i2bs (8 * 12) . fst) $ headMay $ readHex $ T.unpack t Just (c, _) -> fail $ "Unknown prefix: " ++ [c] where headMay [] = Nothing headMay (x:_) = Just x readMay t = case reads $ T.unpack t of (x, _):_ -> return x [] -> fail "Could not read" -- taken from crypto-api -- |@i2bs bitLen i@ converts @i@ to a 'ByteString' of @bitLen@ bits (must be a multiple of 8). i2bs :: Int -> Integer -> BS.ByteString i2bs l i = BS.unfoldr (\l' -> if l' < 0 then Nothing else Just (fromIntegral (i `shiftR` l'), l' - 8)) (l-8) {-# INLINE i2bs #-} parseJSON (A.Number n) = return $ if fromInteger (floor n) == n then PersistInt64 $ floor n else PersistDouble $ fromRational $ toRational n parseJSON (A.Bool b) = return $ PersistBool b parseJSON A.Null = return $ PersistNull parseJSON (A.Array a) = fmap PersistList (mapM A.parseJSON $ V.toList a) parseJSON (A.Object o) = fmap PersistMap $ mapM go $ HM.toList o where go (k, v) = fmap ((,) k) $ A.parseJSON v -- | A SQL data type. Naming attempts to reflect the underlying Haskell -- datatypes, eg SqlString instead of SqlVarchar. Different SQL databases may -- have different translations for these types. data SqlType = SqlString | SqlInt32 | SqlInt64 | SqlReal | SqlNumeric Word32 Word32 | SqlBool | SqlDay | SqlTime | SqlDayTime -- ^ Always uses UTC timezone | SqlBlob | SqlOther T.Text -- ^ a backend-specific name deriving (Show, Read, Eq, Typeable, Ord) data PersistFilter = Eq | Ne | Gt | Lt | Ge | Le | In | NotIn | BackendSpecificFilter T.Text deriving (Read, Show) data UpdateException = KeyNotFound String | UpsertError String deriving Typeable instance Show UpdateException where show (KeyNotFound key) = "Key not found during updateGet: " ++ key show (UpsertError msg) = "Error during upsert: " ++ msg instance Exception UpdateException data OnlyUniqueException = OnlyUniqueException String deriving Typeable instance Show OnlyUniqueException where show (OnlyUniqueException uniqueMsg) = "Expected only one unique key, got " ++ uniqueMsg instance Exception OnlyUniqueException data PersistUpdate = Assign | Add | Subtract | Multiply | Divide | BackendSpecificUpdate T.Text deriving (Read, Show) persistent-2.10.5.2/Database/Persist/Class/DeleteCascade.hs0000644000000000000000000000243313464141060021532 0ustar0000000000000000module Database.Persist.Class.DeleteCascade ( DeleteCascade (..) , deleteCascadeWhere ) where import Control.Monad.IO.Class (MonadIO, liftIO) import Control.Monad.Reader (ReaderT, ask, runReaderT) import Data.Conduit import qualified Data.Conduit.List as CL import Data.Acquire (with) import Database.Persist.Class.PersistStore import Database.Persist.Class.PersistQuery import Database.Persist.Class.PersistEntity -- | For combinations of backends and entities that support -- cascade-deletion. “Cascade-deletion” means that entries that depend on -- other entries to be deleted will be deleted as well. 
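--
-- A minimal, hypothetical sketch of an instance (the @Post@ and @Comment@
-- entities and the @CommentPost@ field are illustrative, not part of this
-- package):
--
-- @
-- instance DeleteCascade Post SqlBackend where
--     deleteCascade postId = do
--         deleteWhere [CommentPost ==. postId]
--         delete postId
-- @
--
-- With such an instance, @deleteCascade postId@ removes the post together
-- with every comment that references it.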
class (PersistStoreWrite backend, PersistEntity record, BaseBackend backend ~ PersistEntityBackend record) => DeleteCascade record backend where -- | Perform cascade-deletion of single database -- entry. deleteCascade :: MonadIO m => Key record -> ReaderT backend m () -- | Cascade-deletion of entries satisfying given filters. deleteCascadeWhere :: (MonadIO m, DeleteCascade record backend, PersistQueryWrite backend) => [Filter record] -> ReaderT backend m () deleteCascadeWhere filts = do srcRes <- selectKeysRes filts [] conn <- ask liftIO $ with srcRes (\src -> runConduit $ src .| CL.mapM_ (flip runReaderT conn . deleteCascade)) persistent-2.10.5.2/Database/Persist/Class/PersistEntity.hs0000644000000000000000000003560413621317264021726 0ustar0000000000000000{-# LANGUAGE DeriveGeneric #-} {-# LANGUAGE GADTs #-} {-# LANGUAGE RankNTypes #-} {-# LANGUAGE StandaloneDeriving #-} {-# LANGUAGE UndecidableInstances #-} module Database.Persist.Class.PersistEntity ( PersistEntity (..) , Update (..) , BackendSpecificUpdate , SelectOpt (..) , Filter (..) , FilterValue (..) , BackendSpecificFilter , Entity (..) , recordName , entityValues , keyValueEntityToJSON, keyValueEntityFromJSON , entityIdToJSON, entityIdFromJSON -- * PersistField based on other typeclasses , toPersistValueJSON, fromPersistValueJSON , toPersistValueEnum, fromPersistValueEnum ) where import Data.Aeson (ToJSON (..), FromJSON (..), fromJSON, object, (.:), (.=), Value (Object)) import qualified Data.Aeson.Parser as AP import Data.Aeson.Types (Parser,Result(Error,Success)) import Data.Aeson.Text (encodeToTextBuilder) import Data.Attoparsec.ByteString (parseOnly) import qualified Data.HashMap.Strict as HM import Data.Maybe (isJust) import Data.Monoid (mappend) import Data.Text (Text) import qualified Data.Text as T import qualified Data.Text.Encoding as TE import qualified Data.Text.Lazy as LT import qualified Data.Text.Lazy.Builder as TB import Data.Typeable (Typeable) import GHC.Generics import Database.Persist.Class.PersistField import Database.Persist.Types.Base -- | Persistent serialized Haskell records to the database. -- A Database 'Entity' (A row in SQL, a document in MongoDB, etc) -- corresponds to a 'Key' plus a Haskell record. -- -- For every Haskell record type stored in the database there is a -- corresponding 'PersistEntity' instance. An instance of PersistEntity -- contains meta-data for the record. PersistEntity also helps abstract -- over different record types. That way the same query interface can return -- a 'PersistEntity', with each query returning different types of Haskell -- records. -- -- Some advanced type system capabilities are used to make this process -- type-safe. Persistent users usually don't need to understand the class -- associated data and functions. class ( PersistField (Key record), ToJSON (Key record), FromJSON (Key record) , Show (Key record), Read (Key record), Eq (Key record), Ord (Key record)) => PersistEntity record where -- | Persistent allows multiple different backends (databases). type PersistEntityBackend record -- | By default, a backend will automatically generate the key -- Instead you can specify a Primary key made up of unique values. data Key record -- | A lower-level key operation. keyToValues :: Key record -> [PersistValue] -- | A lower-level key operation. keyFromValues :: [PersistValue] -> Either Text (Key record) -- | A meta-operation to retrieve the 'Key' 'EntityField'. persistIdField :: EntityField record (Key record) -- | Retrieve the 'EntityDef' meta-data for the record. 
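    --
    -- The proxy argument is usually supplied as a typed 'Nothing' or as
    -- 'Just' a record value, for example (assuming a @User@ entity):
    --
    -- @
    -- userTableName :: DBName
    -- userTableName = entityDB (entityDef (Nothing :: Maybe User))
    -- @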
entityDef :: Monad m => m record -> EntityDef -- | An 'EntityField' is parameterised by the Haskell record it belongs to -- and the additional type of that field. data EntityField record :: * -> * -- | Return meta-data for a given 'EntityField'. persistFieldDef :: EntityField record typ -> FieldDef -- | A meta-operation to get the database fields of a record. toPersistFields :: record -> [SomePersistField] -- | A lower-level operation to convert from database values to a Haskell record. fromPersistValues :: [PersistValue] -> Either Text record -- | Unique keys besides the 'Key'. data Unique record -- | A meta operation to retrieve all the 'Unique' keys. persistUniqueKeys :: record -> [Unique record] -- | A lower level operation. persistUniqueToFieldNames :: Unique record -> [(HaskellName, DBName)] -- | A lower level operation. persistUniqueToValues :: Unique record -> [PersistValue] -- | Use a 'PersistField' as a lens. fieldLens :: EntityField record field -> (forall f. Functor f => (field -> f field) -> Entity record -> f (Entity record)) type family BackendSpecificUpdate backend record -- Moved over from Database.Persist.Class.PersistUnique -- | Textual representation of the record recordName :: (PersistEntity record) => record -> Text recordName = unHaskellName . entityHaskell . entityDef . Just -- | Updating a database entity. -- -- Persistent users use combinators to create these. data Update record = forall typ. PersistField typ => Update { updateField :: EntityField record typ , updateValue :: typ -- FIXME Replace with expr down the road , updateUpdate :: PersistUpdate } | BackendUpdate (BackendSpecificUpdate (PersistEntityBackend record) record) -- | Query options. -- -- Persistent users use these directly. data SelectOpt record = forall typ. Asc (EntityField record typ) | forall typ. Desc (EntityField record typ) | OffsetBy Int | LimitTo Int type family BackendSpecificFilter backend record -- | Filters which are available for 'select', 'updateWhere' and -- 'deleteWhere'. Each filter constructor specifies the field being -- filtered on, the type of comparison applied (equals, not equals, etc) -- and the argument for the comparison. -- -- Persistent users use combinators to create these. -- -- Note that it's important to be careful about the 'PersistFilter' that -- you are using, if you use this directly. For example, using the 'In' -- 'PersistFilter' requires that you have an array- or list-shaped -- 'EntityField'. It is possible to construct values using this that will -- create malformed runtime values. data Filter record = forall typ. PersistField typ => Filter { filterField :: EntityField record typ , filterValue :: FilterValue typ , filterFilter :: PersistFilter -- FIXME } | FilterAnd [Filter record] -- ^ convenient for internal use, not needed for the API | FilterOr [Filter record] | BackendFilter (BackendSpecificFilter (PersistEntityBackend record) record) -- | Value to filter with. Highly dependant on the type of filter used. -- -- @since 2.10.0 data FilterValue typ where FilterValue :: typ -> FilterValue typ FilterValues :: [typ] -> FilterValue typ UnsafeValue :: forall a typ. PersistField a => a -> FilterValue typ -- | Datatype that represents an entity, with both its 'Key' and -- its Haskell record representation. -- -- When using a SQL-based backend (such as SQLite or -- PostgreSQL), an 'Entity' may take any number of columns -- depending on how many fields it has. 
In order to reconstruct -- your entity on the Haskell side, @persistent@ needs all of -- your entity columns and in the right order. Note that you -- don't need to worry about this when using @persistent@\'s API -- since everything is handled correctly behind the scenes. -- -- However, if you want to issue a raw SQL command that returns -- an 'Entity', then you have to be careful with the column -- order. While you could use @SELECT Entity.* WHERE ...@ and -- that would work most of the time, there are times when the -- order of the columns on your database is different from the -- order that @persistent@ expects (for example, if you add a new -- field in the middle of you entity definition and then use the -- migration code -- @persistent@ will expect the column to be in -- the middle, but your DBMS will put it as the last column). -- So, instead of using a query like the one above, you may use -- 'Database.Persist.GenericSql.rawSql' (from the -- "Database.Persist.GenericSql" module) with its /entity -- selection placeholder/ (a double question mark @??@). Using -- @rawSql@ the query above must be written as @SELECT ?? WHERE -- ..@. Then @rawSql@ will replace @??@ with the list of all -- columns that we need from your entity in the right order. If -- your query returns two entities (i.e. @(Entity backend a, -- Entity backend b)@), then you must you use @SELECT ??, ?? -- WHERE ...@, and so on. data Entity record = Entity { entityKey :: Key record , entityVal :: record } deriving Typeable deriving instance (Generic (Key record), Generic record) => Generic (Entity record) deriving instance (Eq (Key record), Eq record) => Eq (Entity record) deriving instance (Ord (Key record), Ord record) => Ord (Entity record) deriving instance (Show (Key record), Show record) => Show (Entity record) deriving instance (Read (Key record), Read record) => Read (Entity record) -- | Get list of values corresponding to given entity. entityValues :: PersistEntity record => Entity record -> [PersistValue] entityValues (Entity k record) = if isJust (entityPrimary ent) then -- TODO: check against the key map toPersistValue (toPersistFields record) else keyToValues k ++ map toPersistValue (toPersistFields record) where ent = entityDef $ Just record -- | Predefined @toJSON@. The resulting JSON looks like -- @{"key": 1, "value": {"name": ...}}@. -- -- The typical usage is: -- -- @ -- instance ToJSON (Entity User) where -- toJSON = keyValueEntityToJSON -- @ keyValueEntityToJSON :: (PersistEntity record, ToJSON record) => Entity record -> Value keyValueEntityToJSON (Entity key value) = object [ "key" .= key , "value" .= value ] -- | Predefined @parseJSON@. The input JSON looks like -- @{"key": 1, "value": {"name": ...}}@. -- -- The typical usage is: -- -- @ -- instance FromJSON (Entity User) where -- parseJSON = keyValueEntityFromJSON -- @ keyValueEntityFromJSON :: (PersistEntity record, FromJSON record) => Value -> Parser (Entity record) keyValueEntityFromJSON (Object o) = Entity <$> o .: "key" <*> o .: "value" keyValueEntityFromJSON _ = fail "keyValueEntityFromJSON: not an object" -- | Predefined @toJSON@. The resulting JSON looks like -- @{"id": 1, "name": ...}@. -- -- The typical usage is: -- -- @ -- instance ToJSON (Entity User) where -- toJSON = entityIdToJSON -- @ entityIdToJSON :: (PersistEntity record, ToJSON record) => Entity record -> Value entityIdToJSON (Entity key value) = case toJSON value of Object o -> Object $ HM.insert "id" (toJSON key) o x -> x -- | Predefined @parseJSON@. 
The input JSON looks like -- @{"id": 1, "name": ...}@. -- -- The typical usage is: -- -- @ -- instance FromJSON (Entity User) where -- parseJSON = entityIdFromJSON -- @ entityIdFromJSON :: (PersistEntity record, FromJSON record) => Value -> Parser (Entity record) entityIdFromJSON value@(Object o) = Entity <$> o .: "id" <*> parseJSON value entityIdFromJSON _ = fail "entityIdFromJSON: not an object" instance (PersistEntity record, PersistField record, PersistField (Key record)) => PersistField (Entity record) where toPersistValue (Entity key value) = case toPersistValue value of (PersistMap alist) -> PersistMap ((idField, toPersistValue key) : alist) _ -> error $ T.unpack $ errMsg "expected PersistMap" fromPersistValue (PersistMap alist) = case after of [] -> Left $ errMsg $ "did not find " `Data.Monoid.mappend` idField `mappend` " field" ("_id", kv):afterRest -> fromPersistValue (PersistMap (before ++ afterRest)) >>= \record -> keyFromValues [kv] >>= \k -> Right (Entity k record) _ -> Left $ errMsg $ "impossible id field: " `mappend` T.pack (show alist) where (before, after) = break ((== idField) . fst) alist fromPersistValue x = Left $ errMsg "Expected PersistMap, received: " `mappend` T.pack (show x) errMsg :: Text -> Text errMsg = mappend "PersistField entity fromPersistValue: " -- | Realistically this is only going to be used for MongoDB, -- so lets use MongoDB conventions idField :: Text idField = "_id" -- | Convenience function for getting a free 'PersistField' instance -- from a type with JSON instances. -- -- -- Example usage in combination with 'fromPersistValueJSON': -- -- @ -- instance PersistField MyData where -- fromPersistValue = fromPersistValueJSON -- toPersistValue = toPersistValueJSON -- @ toPersistValueJSON :: ToJSON a => a -> PersistValue toPersistValueJSON = PersistText . LT.toStrict . TB.toLazyText . encodeToTextBuilder . toJSON -- | Convenience function for getting a free 'PersistField' instance -- from a type with JSON instances. The JSON parser used will accept JSON -- values other that object and arrays. So, if your instance serializes the -- data to a JSON string, this will still work. -- -- -- Example usage in combination with 'toPersistValueJSON': -- -- @ -- instance PersistField MyData where -- fromPersistValue = fromPersistValueJSON -- toPersistValue = toPersistValueJSON -- @ fromPersistValueJSON :: FromJSON a => PersistValue -> Either Text a fromPersistValueJSON z = case z of PersistByteString bs -> mapLeft (T.append "Could not parse the JSON (was a PersistByteString): ") $ parseGo bs PersistText t -> mapLeft (T.append "Could not parse the JSON (was PersistText): ") $ parseGo (TE.encodeUtf8 t) a -> Left $ T.append "Expected PersistByteString, received: " (T.pack (show a)) where parseGo bs = mapLeft T.pack $ case parseOnly AP.value bs of Left err -> Left err Right v -> case fromJSON v of Error err -> Left err Success a -> Right a mapLeft _ (Right a) = Right a mapLeft f (Left b) = Left (f b) -- | Convenience function for getting a free 'PersistField' instance -- from a type with an 'Enum' instance. The function 'derivePersistField' -- from the persistent-template package should generally be preferred. -- However, if you want to ensure that an @ORDER BY@ clause that uses -- your field will order rows by the data constructor order, this is -- a better choice. 
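--
-- (For comparison: 'derivePersistField' stores a value via its 'Show'
-- instance, so an @ORDER BY@ on that column sorts the textual
-- representation alphabetically, whereas 'toPersistValueEnum' stores the
-- 'fromEnum' index and therefore preserves the declaration order.)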
-- -- Example usage in combination with 'fromPersistValueEnum': -- -- @ -- data SeverityLevel = Low | Medium | Critical | High -- deriving (Enum, Bounded) -- instance PersistField SeverityLevel where -- fromPersistValue = fromPersistValueEnum -- toPersistValue = toPersistValueEnum -- @ toPersistValueEnum :: Enum a => a -> PersistValue toPersistValueEnum = toPersistValue . fromEnum -- | Convenience function for getting a free 'PersistField' instance -- from a type with an 'Enum' instance. This function also requires -- a `Bounded` instance to improve the reporting of errors. -- -- Example usage in combination with 'toPersistValueEnum': -- -- @ -- data SeverityLevel = Low | Medium | Critical | High -- deriving (Enum, Bounded) -- instance PersistField SeverityLevel where -- fromPersistValue = fromPersistValueEnum -- toPersistValue = toPersistValueEnum -- @ fromPersistValueEnum :: (Enum a, Bounded a) => PersistValue -> Either Text a fromPersistValueEnum v = fromPersistValue v >>= go where go i = let res = toEnum i in if i >= fromEnum (asTypeOf minBound res) && i <= fromEnum (asTypeOf maxBound res) then Right res else Left ("The number " `mappend` T.pack (show i) `mappend` " was out of the " `mappend` "allowed bounds for an enum type") persistent-2.10.5.2/Database/Persist/Class/PersistQuery.hs0000644000000000000000000001026613464141060021546 0ustar0000000000000000module Database.Persist.Class.PersistQuery ( PersistQueryRead (..) , PersistQueryWrite (..) , selectSource , selectKeys , selectList , selectKeysList ) where import Control.Monad.IO.Class (MonadIO, liftIO) import Control.Monad.Reader (ReaderT, MonadReader) import Control.Monad.Trans.Resource (MonadResource, release) import Data.Acquire (Acquire, allocateAcquire, with) import Data.Conduit (ConduitM, (.|), await, runConduit) import qualified Data.Conduit.List as CL import Database.Persist.Class.PersistStore import Database.Persist.Class.PersistEntity -- | Backends supporting conditional read operations. class (PersistCore backend, PersistStoreRead backend) => PersistQueryRead backend where -- | Get all records matching the given criterion in the specified order. -- Returns also the identifiers. selectSourceRes :: (PersistRecordBackend record backend, MonadIO m1, MonadIO m2) => [Filter record] -> [SelectOpt record] -> ReaderT backend m1 (Acquire (ConduitM () (Entity record) m2 ())) -- | Get just the first record for the criterion. selectFirst :: (MonadIO m, PersistRecordBackend record backend) => [Filter record] -> [SelectOpt record] -> ReaderT backend m (Maybe (Entity record)) selectFirst filts opts = do srcRes <- selectSourceRes filts (LimitTo 1 : opts) liftIO $ with srcRes (\src -> runConduit $ src .| await) -- | Get the 'Key's of all records matching the given criterion. selectKeysRes :: (MonadIO m1, MonadIO m2, PersistRecordBackend record backend) => [Filter record] -> [SelectOpt record] -> ReaderT backend m1 (Acquire (ConduitM () (Key record) m2 ())) -- | The total number of records fulfilling the given criterion. count :: (MonadIO m, PersistRecordBackend record backend) => [Filter record] -> ReaderT backend m Int -- | Backends supporting conditional write operations class (PersistQueryRead backend, PersistStoreWrite backend) => PersistQueryWrite backend where -- | Update individual fields on any record matching the given criterion. updateWhere :: (MonadIO m, PersistRecordBackend record backend) => [Filter record] -> [Update record] -> ReaderT backend m () -- | Delete all records matching the given criterion. 
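    --
    -- A hedged sketch, assuming a @User@ entity with an @age@ field:
    --
    -- @
    -- deleteRetired :: MonadIO m => ReaderT SqlBackend m ()
    -- deleteRetired = deleteWhere [UserAge >=. 65]
    -- @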
deleteWhere :: (MonadIO m, PersistRecordBackend record backend) => [Filter record] -> ReaderT backend m () -- | Get all records matching the given criterion in the specified order. -- Returns also the identifiers. selectSource :: (PersistQueryRead backend, MonadResource m, PersistRecordBackend record backend, MonadReader backend m) => [Filter record] -> [SelectOpt record] -> ConduitM () (Entity record) m () selectSource filts opts = do srcRes <- liftPersist $ selectSourceRes filts opts (releaseKey, src) <- allocateAcquire srcRes src release releaseKey -- | Get the 'Key's of all records matching the given criterion. selectKeys :: (PersistQueryRead backend, MonadResource m, PersistRecordBackend record backend, MonadReader backend m) => [Filter record] -> [SelectOpt record] -> ConduitM () (Key record) m () selectKeys filts opts = do srcRes <- liftPersist $ selectKeysRes filts opts (releaseKey, src) <- allocateAcquire srcRes src release releaseKey -- | Call 'selectSource' but return the result as a list. selectList :: (MonadIO m, PersistQueryRead backend, PersistRecordBackend record backend) => [Filter record] -> [SelectOpt record] -> ReaderT backend m [Entity record] selectList filts opts = do srcRes <- selectSourceRes filts opts liftIO $ with srcRes (\src -> runConduit $ src .| CL.consume) -- | Call 'selectKeys' but return the result as a list. selectKeysList :: (MonadIO m, PersistQueryRead backend, PersistRecordBackend record backend) => [Filter record] -> [SelectOpt record] -> ReaderT backend m [Key record] selectKeysList filts opts = do srcRes <- selectKeysRes filts opts liftIO $ with srcRes (\src -> runConduit $ src .| CL.consume) persistent-2.10.5.2/Database/Persist/Class/PersistUnique.hs0000644000000000000000000005464713563026164021731 0ustar0000000000000000{-# LANGUAGE DataKinds #-} {-# LANGUAGE ScopedTypeVariables #-} {-# LANGUAGE TypeOperators #-} module Database.Persist.Class.PersistUnique ( PersistUniqueRead(..) , PersistUniqueWrite(..) , OnlyOneUniqueKey(..) , onlyOneUniqueDef , AtLeastOneUniqueKey(..) , atLeastOneUniqueDef , NoUniqueKeysError , MultipleUniqueKeysError , getByValue , getByValueUniques , insertBy , insertUniqueEntity , replaceUnique , checkUnique , onlyUnique , defaultPutMany , persistUniqueKeyValues ) where import Control.Monad (liftM) import Control.Monad.IO.Class (MonadIO) import Control.Monad.Trans.Reader (ReaderT) import Data.Function (on) import Data.List ((\\), deleteFirstsBy) import Data.List.NonEmpty (NonEmpty(..)) import qualified Data.List.NonEmpty as NEL import qualified Data.Map as Map import Data.Maybe (catMaybes) import Data.Text (Text) import GHC.TypeLits (ErrorMessage(..)) import Database.Persist.Types import Database.Persist.Class.PersistStore import Database.Persist.Class.PersistEntity -- | Queries against 'Unique' keys (other than the id 'Key'). -- -- Please read the general Persistent documentation to learn how to create -- 'Unique' keys. -- -- Using this with an Entity without a Unique key leads to undefined -- behavior. A few of these functions require a /single/ 'Unique', so using -- an Entity with multiple 'Unique's is also undefined. In these cases -- persistent's goal is to throw an exception as soon as possible, but -- persistent is still transitioning to that. -- -- SQL backends automatically create uniqueness constraints, but for MongoDB -- you must manually place a unique index on a field to have a uniqueness -- constraint. 
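--
-- For reference, a uniqueness constraint is declared in the entity
-- quasiquoter as a capitalized constraint name followed by the fields it
-- covers. The illustrative schema below matches the @User@ examples used
-- throughout this module:
--
-- > User
-- >     name String
-- >     age Int
-- >     UniqueUserName name
-- >     deriving Show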
-- class (PersistCore backend, PersistStoreRead backend) => PersistUniqueRead backend where -- | Get a record by unique key, if available. Returns also the identifier. -- -- === __Example usage__ -- -- With <#schema-persist-unique-1 schema-1> and <#dataset-persist-unique-1 dataset-1>: -- -- > getBySpjName :: MonadIO m => ReaderT SqlBackend m (Maybe (Entity User)) -- > getBySpjName = getBy $ UniqueUserName "SPJ" -- -- > mSpjEnt <- getBySpjName -- -- The above query when applied on <#dataset-persist-unique-1 dataset-1>, will get this entity: -- -- > +----+------+-----+ -- > | id | name | age | -- > +----+------+-----+ -- > | 1 | SPJ | 40 | -- > +----+------+-----+ getBy :: (MonadIO m, PersistRecordBackend record backend) => Unique record -> ReaderT backend m (Maybe (Entity record)) -- | Some functions in this module ('insertUnique', 'insertBy', and -- 'replaceUnique') first query the unique indexes to check for -- conflicts. You could instead optimistically attempt to perform the -- operation (e.g. 'replace' instead of 'replaceUnique'). However, -- -- * there is some fragility to trying to catch the correct exception and -- determing the column of failure; -- -- * an exception will automatically abort the current SQL transaction. class (PersistUniqueRead backend, PersistStoreWrite backend) => PersistUniqueWrite backend where -- | Delete a specific record by unique key. Does nothing if no record -- matches. -- -- === __Example usage__ -- -- With <#schema-persist-unique-1 schema-1> and <#dataset-persist-unique-1 dataset-1>, -- -- > deleteBySpjName :: MonadIO m => ReaderT SqlBackend m () -- > deleteBySpjName = deleteBy UniqueUserName "SPJ" -- -- The above query when applied on <#dataset-persist-unique-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ deleteBy :: (MonadIO m, PersistRecordBackend record backend) => Unique record -> ReaderT backend m () -- | Like 'insert', but returns 'Nothing' when the record -- couldn't be inserted because of a uniqueness constraint. -- -- === __Example usage__ -- -- With <#schema-persist-unique-1 schema-1> and <#dataset-persist-unique-1 dataset-1>, we try to insert the following two records: -- -- > linusId <- insertUnique $ User "Linus" 48 -- > spjId <- insertUnique $ User "SPJ" 90 -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |40 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ -- > |3 |Linus |48 | -- > +-----+------+-----+ -- -- Linus's record was inserted to <#dataset-persist-unique-1 dataset-1>, while SPJ wasn't because SPJ already exists in <#dataset-persist-unique-1 dataset-1>. insertUnique :: (MonadIO m, PersistRecordBackend record backend) => record -> ReaderT backend m (Maybe (Key record)) insertUnique datum = do conflict <- checkUnique datum case conflict of Nothing -> Just `liftM` insert datum Just _ -> return Nothing -- | Update based on a uniqueness constraint or insert: -- -- * insert the new record if it does not exist; -- * If the record exists (matched via it's uniqueness constraint), then update the existing record with the parameters which is passed on as list to the function. -- -- === __Example usage__ -- -- First, we try to explain 'upsert' using <#schema-persist-unique-1 schema-1> and <#dataset-persist-unique-1 dataset-1>. 
-- -- > upsertSpj :: MonadIO m => [Update User] -> ReaderT SqlBackend m (Maybe (Entity User)) -- > upsertSpj updates = upsert (User "SPJ" 999) upadtes -- -- > mSpjEnt <- upsertSpj [UserAge +=. 15] -- -- The above query when applied on <#dataset-persist-unique-1 dataset-1>, will produce this: -- -- > +-----+-----+--------+ -- > |id |name |age | -- > +-----+-----+--------+ -- > |1 |SPJ |40 -> 55| -- > +-----+-----+--------+ -- > |2 |Simon|41 | -- > +-----+-----+--------+ -- -- > upsertX :: MonadIO m => [Update User] -> ReaderT SqlBackend m (Maybe (Entity User)) -- > upsertX updates = upsert (User "X" 999) updates -- -- > mXEnt <- upsertX [UserAge +=. 15] -- -- The above query when applied on <#dataset-persist-unique-1 dataset-1>, will produce this: -- -- > +-----+-----+--------+ -- > |id |name |age | -- > +-----+-----+--------+ -- > |1 |SPJ |40 | -- > +-----+-----+--------+ -- > |2 |Simon|41 | -- > +-----+-----+--------+ -- > |3 |X |999 | -- > +-----+-----+--------+ -- -- Next, what if the schema has two uniqueness constraints? -- Let's check it out using <#schema-persist-unique-2 schema-2>: -- -- > mSpjEnt <- upsertSpj [UserAge +=. 15] -- -- This fails with a compile-time type error alerting us to the fact -- that this record has multiple unique keys, and suggests that we look for -- 'upsertBy' to select the unique key we want. upsert :: (MonadIO m, PersistRecordBackend record backend, OnlyOneUniqueKey record) => record -- ^ new record to insert -> [Update record] -- ^ updates to perform if the record already exists -> ReaderT backend m (Entity record) -- ^ the record in the database after the operation upsert record updates = do uniqueKey <- onlyUnique record upsertBy uniqueKey record updates -- | Update based on a given uniqueness constraint or insert: -- -- * insert the new record if it does not exist; -- * update the existing record that matches the given uniqueness constraint. -- -- === __Example usage__ -- -- We try to explain 'upsertBy' using <#schema-persist-unique-2 schema-2> and <#dataset-persist-unique-1 dataset-1>. -- -- > upsertBySpjName :: MonadIO m => User -> [Update User] -> ReaderT SqlBackend m (Entity User) -- > upsertBySpjName record updates = upsertBy (UniqueUserName "SPJ") record updates -- -- > mSpjEnt <- upsertBySpjName (Person "X" 999) [PersonAge += .15] -- -- The above query will alter <#dataset-persist-unique-1 dataset-1> to: -- -- > +-----+-----+--------+ -- > |id |name |age | -- > +-----+-----+--------+ -- > |1 |SPJ |40 -> 55| -- > +-----+-----+--------+ -- > |2 |Simon|41 | -- > +-----+-----+--------+ -- -- > upsertBySimonAge :: MonadIO m => User -> [Update User] -> ReaderT SqlBackend m (Entity User) -- > upsertBySimonAge record updates = upsertBy (UniqueUserName "SPJ") record updates -- -- > mPhilipEnt <- upsertBySimonAge (User "X" 999) [UserName =. "Philip"] -- -- The above query will alter <#dataset-persist-unique-1 dataset-1> to: -- -- > +----+-----------------+-----+ -- > | id | name | age | -- > +----+-----------------+-----+ -- > | 1 | SPJ | 40 | -- > +----+-----------------+-----+ -- > | 2 | Simon -> Philip | 41 | -- > +----+-----------------+-----+ -- -- > upsertByUnknownName :: MonadIO m => User -> [Update User] -> ReaderT SqlBackend m (Entity User) -- > upsertByUnknownName record updates = upsertBy (UniqueUserName "Unknown") record updates -- -- > mXEnt <- upsertByUnknownName (User "X" 999) [UserAge +=. 
15] -- -- This query will alter <#dataset-persist-unique-1 dataset-1> to: -- -- > +-----+-----+-----+ -- > |id |name |age | -- > +-----+-----+-----+ -- > |1 |SPJ |40 | -- > +-----+-----+-----+ -- > |2 |Simon|41 | -- > +-----+-----+-----+ -- > |3 |X |999 | -- > +-----+-----+-----+ upsertBy :: (MonadIO m, PersistRecordBackend record backend) => Unique record -- ^ uniqueness constraint to find by -> record -- ^ new record to insert -> [Update record] -- ^ updates to perform if the record already exists -> ReaderT backend m (Entity record) -- ^ the record in the database after the operation upsertBy uniqueKey record updates = do mrecord <- getBy uniqueKey maybe (insertEntity record) (`updateGetEntity` updates) mrecord where updateGetEntity (Entity k _) upds = (Entity k) `liftM` (updateGet k upds) -- | Put many records into db -- -- * insert new records that do not exist (or violate any unique constraints) -- * replace existing records (matching any unique constraint) -- -- @since 2.8.1 putMany :: ( MonadIO m , PersistRecordBackend record backend ) => [record] -- ^ A list of the records you want to insert or replace. -> ReaderT backend m () putMany = defaultPutMany -- | This class is used to ensure that 'upsert' is only called on records -- that have a single 'Unique' key. The quasiquoter automatically generates -- working instances for appropriate records, and generates 'TypeError' -- instances for records that have 0 or multiple unique keys. -- -- @since 2.10.0 class PersistEntity record => OnlyOneUniqueKey record where onlyUniqueP :: record -> Unique record -- | Given a proxy for a 'PersistEntity' record, this returns the sole -- 'UniqueDef' for that entity. -- -- @since 2.10.0 onlyOneUniqueDef :: (OnlyOneUniqueKey record, Monad proxy) => proxy record -> UniqueDef onlyOneUniqueDef prxy = case entityUniques (entityDef prxy) of [uniq] -> uniq _ -> error "impossible due to OnlyOneUniqueKey constraint" -- | This is an error message. It is used when writing instances of -- 'OnlyOneUniqueKey' for an entity that has no unique keys. -- -- @since 2.10.0 type NoUniqueKeysError ty = 'Text "The entity " ':<>: 'ShowType ty ':<>: 'Text " does not have any unique keys." ':$$: 'Text "The function you are trying to call requires a unique key " ':<>: 'Text "to be defined on the entity." -- | This is an error message. It is used when an entity has multiple -- unique keys, and the function expects a single unique key. -- -- @since 2.10.0 type MultipleUniqueKeysError ty = 'Text "The entity " ':<>: 'ShowType ty ':<>: 'Text " has multiple unique keys." ':$$: 'Text "The function you are trying to call requires only a single " ':<>: 'Text "unique key." ':$$: 'Text "There is probably a variant of the function with 'By' " ':<>: 'Text "appended that will allow you to select a unique key " ':<>: 'Text "for the operation." -- | This class is used to ensure that functions requring at least one -- unique key are not called with records that have 0 unique keys. The -- quasiquoter automatically writes working instances for appropriate -- entities, and generates 'TypeError' instances for records that have -- 0 unique keys. -- -- @since 2.10.0 class PersistEntity record => AtLeastOneUniqueKey record where requireUniquesP :: record -> NonEmpty (Unique record) -- | Given a proxy for a record that has an instance of -- 'AtLeastOneUniqueKey', this returns a 'NonEmpty' list of the -- 'UniqueDef's for that entity. 
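--
-- A hedged usage sketch, assuming a @User@ entity with at least one
-- uniqueness constraint (@NEL@ is this module's qualified import of
-- "Data.List.NonEmpty"):
--
-- @
-- userUniqueDbNames :: [DBName]
-- userUniqueDbNames =
--     map uniqueDBName (NEL.toList (atLeastOneUniqueDef (Nothing :: Maybe User)))
-- @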
-- -- @since 2.10.0 atLeastOneUniqueDef :: (AtLeastOneUniqueKey record, Monad proxy) => proxy record -> NonEmpty UniqueDef atLeastOneUniqueDef prxy = case entityUniques (entityDef prxy) of (x:xs) -> x :| xs _ -> error "impossible due to AtLeastOneUniqueKey record constraint" -- | Insert a value, checking for conflicts with any unique constraints. If a -- duplicate exists in the database, it is returned as 'Left'. Otherwise, the -- new 'Key is returned as 'Right'. -- -- === __Example usage__ -- -- With <#schema-persist-unique-2 schema-2> and <#dataset-persist-unique-1 dataset-1>, we have following lines of code: -- -- > l1 <- insertBy $ User "SPJ" 20 -- > l2 <- insertBy $ User "XXX" 41 -- > l3 <- insertBy $ User "SPJ" 40 -- > r1 <- insertBy $ User "XXX" 100 -- -- First three lines return 'Left' because there're duplicates in given record's uniqueness constraints. While the last line returns a new key as 'Right'. insertBy :: ( MonadIO m , PersistUniqueWrite backend , PersistRecordBackend record backend , AtLeastOneUniqueKey record ) => record -> ReaderT backend m (Either (Entity record) (Key record)) insertBy val = do res <- getByValue val case res of Nothing -> Right `liftM` insert val Just z -> return $ Left z -- | Like 'insertEntity', but returns 'Nothing' when the record -- couldn't be inserted because of a uniqueness constraint. -- -- @since 2.7.1 -- -- === __Example usage__ -- -- We use <#schema-persist-unique-2 schema-2> and <#dataset-persist-unique-1 dataset-1> here. -- -- > insertUniqueSpjEntity :: MonadIO m => ReaderT SqlBackend m (Maybe (Entity User)) -- > insertUniqueSpjEntity = insertUniqueEntity $ User "SPJ" 50 -- -- > mSpjEnt <- insertUniqueSpjEntity -- -- The above query results 'Nothing' as SPJ already exists. -- -- > insertUniqueAlexaEntity :: MonadIO m => ReaderT SqlBackend m (Maybe (Entity User)) -- > insertUniqueAlexaEntity = insertUniqueEntity $ User "Alexa" 3 -- -- > mAlexaEnt <- insertUniqueSpjEntity -- -- Because there's no such unique keywords of the given record, the above query when applied on <#dataset-persist-unique-1 dataset-1>, will produce this: -- -- > +----+-------+-----+ -- > | id | name | age | -- > +----+-------+-----+ -- > | 1 | SPJ | 40 | -- > +----+-------+-----+ -- > | 2 | Simon | 41 | -- > +----+-------+-----+ -- > | 3 | Alexa | 3 | -- > +----+-------+-----+ insertUniqueEntity :: (MonadIO m ,PersistRecordBackend record backend ,PersistUniqueWrite backend) => record -> ReaderT backend m (Maybe (Entity record)) insertUniqueEntity datum = fmap (\key -> Entity key datum) `liftM` insertUnique datum -- | Return the single unique key for a record. -- -- === __Example usage__ -- -- We use shcema-1 and <#dataset-persist-unique-1 dataset-1> here. -- -- > onlySimonConst :: MonadIO m => ReaderT SqlBackend m (Unique User) -- > onlySimonConst = onlyUnique $ User "Simon" 999 -- -- > mSimonConst <- onlySimonConst -- -- @mSimonConst@ would be Simon's uniqueness constraint. Note that -- @onlyUnique@ doesn't work if there're more than two constraints. It will -- fail with a type error instead. onlyUnique :: ( MonadIO m , PersistUniqueWrite backend , PersistRecordBackend record backend , OnlyOneUniqueKey record ) => record -> ReaderT backend m (Unique record) onlyUnique = pure . onlyUniqueP -- | A modification of 'getBy', which takes the 'PersistEntity' itself instead -- of a 'Unique' record. Returns a record matching /one/ of the unique keys. This -- function makes the most sense on entities with a single 'Unique' -- constructor. 
-- -- === __Example usage__ -- -- With <#schema-persist-unique-1 schema-1> and <#dataset-persist-unique-1 dataset-1>, -- -- getBySpjValue :: MonadIO m => ReaderT SqlBackend m (Maybe (Entity User)) -- getBySpjValue = getByValue $ User "SPJ" 999 -- -- > mSpjEnt <- getBySpjValue -- -- The above query when applied on <#dataset-persist-unique-1 dataset-1>, will get this record: -- -- > +----+------+-----+ -- > | id | name | age | -- > +----+------+-----+ -- > | 1 | SPJ | 40 | -- > +----+------+-----+ getByValue :: forall record m backend. ( MonadIO m , PersistUniqueRead backend , PersistRecordBackend record backend , AtLeastOneUniqueKey record ) => record -> ReaderT backend m (Maybe (Entity record)) getByValue record = do let uniqs = requireUniquesP record getByValueUniques (NEL.toList uniqs) -- | Retrieve a record from the database using the given unique keys. It -- will attempt to find a matching record for each 'Unique' in the list, -- and returns the first one that has a match. -- -- Returns 'Nothing' if you provide an empty list ('[]') or if no value -- matches in the database. -- -- @since 2.10.0 getByValueUniques :: ( MonadIO m , PersistUniqueRead backend , PersistRecordBackend record backend ) => [Unique record] -> ReaderT backend m (Maybe (Entity record)) getByValueUniques uniqs = checkUniques uniqs where checkUniques [] = return Nothing checkUniques (x:xs) = do y <- getBy x case y of Nothing -> checkUniques xs Just z -> return $ Just z -- | Attempt to replace the record of the given key with the given new record. -- First query the unique fields to make sure the replacement maintains -- uniqueness constraints. -- -- Return 'Nothing' if the replacement was made. -- If uniqueness is violated, return a 'Just' with the 'Unique' violation -- -- @since 1.2.2.0 replaceUnique :: ( MonadIO m , Eq (Unique record) , PersistRecordBackend record backend , PersistUniqueWrite backend ) => Key record -> record -> ReaderT backend m (Maybe (Unique record)) replaceUnique key datumNew = getJust key >>= replaceOriginal where uniqueKeysNew = persistUniqueKeys datumNew replaceOriginal original = do conflict <- checkUniqueKeys changedKeys case conflict of Nothing -> replace key datumNew >> return Nothing (Just conflictingKey) -> return $ Just conflictingKey where changedKeys = uniqueKeysNew \\ uniqueKeysOriginal uniqueKeysOriginal = persistUniqueKeys original -- | Check whether there are any conflicts for unique keys with this entity and -- existing entities in the database. -- -- Returns 'Nothing' if the entity would be unique, and could thus safely be inserted. -- on a conflict returns the conflicting key -- -- === __Example usage__ -- -- We use <#schema-persist-unique-1 schema-1> and <#dataset-persist-unique-1 dataset-1> here. -- -- This would be 'Nothing': -- -- > mAlanConst <- checkUnique $ User "Alan" 70 -- -- While this would be 'Just' because SPJ already exists: -- -- > mSpjConst <- checkUnique $ User "SPJ" 60 checkUnique :: (MonadIO m ,PersistRecordBackend record backend ,PersistUniqueRead backend) => record -> ReaderT backend m (Maybe (Unique record)) checkUnique = checkUniqueKeys . persistUniqueKeys checkUniqueKeys :: (MonadIO m ,PersistUniqueRead backend ,PersistRecordBackend record backend) => [Unique record] -> ReaderT backend m (Maybe (Unique record)) checkUniqueKeys [] = return Nothing checkUniqueKeys (x:xs) = do y <- getBy x case y of Nothing -> checkUniqueKeys xs Just _ -> return (Just x) -- | The slow but generic 'putMany' implemetation for any 'PersistUniqueRead'. 
-- * Lookup corresponding entities (if any) for each record using 'getByValue' -- * For pre-existing records, issue a 'replace' for each old key and new record -- * For new records, issue a bulk 'insertMany_' defaultPutMany ::( PersistEntityBackend record ~ BaseBackend backend , PersistEntity record , MonadIO m , PersistStoreWrite backend , PersistUniqueRead backend ) => [record] -> ReaderT backend m () defaultPutMany [] = return () defaultPutMany rsD@(e:_) = do case persistUniqueKeys e of [] -> insertMany_ rsD _ -> go where go = do -- deduplicate the list of records in Haskell by unique key. The -- previous implementation used Data.List.nubBy which is O(n^2) -- complexity. let rs = map snd . Map.toList . Map.fromList . map (\r -> (persistUniqueKeyValues r, r)) $ rsD -- lookup record(s) by their unique key mEsOld <- mapM (getByValueUniques . persistUniqueKeys) rs -- find pre-existing entities and corresponding (incoming) records let merge (Just x) y = Just (x, y) merge _ _ = Nothing let mEsOldAndRs = zipWith merge mEsOld rs let esOldAndRs = catMaybes mEsOldAndRs -- determine records to insert let esOld = fmap fst esOldAndRs let rsOld = fmap entityVal esOld let rsNew = deleteFirstsBy ((==) `on` persistUniqueKeyValues) rs rsOld -- determine records to update let rsUpd = fmap snd esOldAndRs let ksOld = fmap entityKey esOld let krs = zip ksOld rsUpd -- insert `new` records insertMany_ rsNew -- replace existing records mapM_ (uncurry replace) krs -- | This function returns a list of 'PersistValue' that correspond to the -- 'Unique' keys on that record. This is useful for comparing two @record@s -- for equality only on the basis of their 'Unique' keys. persistUniqueKeyValues :: PersistEntity record => record -> [PersistValue] persistUniqueKeyValues = concatMap persistUniqueToValues . persistUniqueKeys persistent-2.10.5.2/Database/Persist/Class/PersistConfig.hs0000644000000000000000000000370513464141060021646 0ustar0000000000000000module Database.Persist.Class.PersistConfig ( PersistConfig (..) ) where import Control.Monad.IO.Unlift (MonadUnliftIO) import Data.Aeson (Value (Object)) import Data.Aeson.Types (Parser) import qualified Data.HashMap.Strict as HashMap -- | Represents a value containing all the configuration options for a specific -- backend. This abstraction makes it easier to write code that can easily swap -- backends. class PersistConfig c where type PersistConfigBackend c :: (* -> *) -> * -> * type PersistConfigPool c -- | Load the config settings from a 'Value', most likely taken from a YAML -- config file. loadConfig :: Value -> Parser c -- | Modify the config settings based on environment variables. applyEnv :: c -> IO c applyEnv = return -- | Create a new connection pool based on the given config settings. createPoolConfig :: c -> IO (PersistConfigPool c) -- | Run a database action by taking a connection from the pool. 
runPool :: MonadUnliftIO m => c -> PersistConfigBackend c m a -> PersistConfigPool c -> m a instance ( PersistConfig c1 , PersistConfig c2 , PersistConfigPool c1 ~ PersistConfigPool c2 , PersistConfigBackend c1 ~ PersistConfigBackend c2 ) => PersistConfig (Either c1 c2) where type PersistConfigBackend (Either c1 c2) = PersistConfigBackend c1 type PersistConfigPool (Either c1 c2) = PersistConfigPool c1 loadConfig (Object o) = case HashMap.lookup "left" o of Just v -> Left <$> loadConfig v Nothing -> case HashMap.lookup "right" o of Just v -> Right <$> loadConfig v Nothing -> fail "PersistConfig for Either: need either a left or right" loadConfig _ = fail "PersistConfig for Either: need an object" createPoolConfig = either createPoolConfig createPoolConfig runPool (Left c) = runPool c runPool (Right c) = runPool c persistent-2.10.5.2/Database/Persist/Class/PersistField.hs0000644000000000000000000005233313620641325021470 0ustar0000000000000000{-# LANGUAGE CPP #-} {-# LANGUAGE ExistentialQuantification #-} {-# LANGUAGE FlexibleInstances #-} {-# LANGUAGE PatternGuards #-} module Database.Persist.Class.PersistField ( PersistField (..) , SomePersistField (..) , getPersistMap ) where import Control.Arrow (second) import Control.Monad ((<=<)) import qualified Data.Aeson as A import Data.ByteString.Char8 (ByteString, unpack, readInt) import qualified Data.ByteString.Lazy as L import Data.Fixed import Data.Int (Int8, Int16, Int32, Int64) import qualified Data.IntMap as IM import qualified Data.Map as M import Data.Monoid ((<>)) import qualified Data.Set as S import Data.Text (Text) import qualified Data.Text as T import Data.Text.Read (double) import qualified Data.Text.Encoding as TE import qualified Data.Text.Encoding.Error as TERR import qualified Data.Text.Lazy as TL import qualified Data.Vector as V import Data.Word (Word, Word8, Word16, Word32, Word64) import Numeric.Natural (Natural) import Text.Blaze.Html import Text.Blaze.Html.Renderer.Text (renderHtml) import Database.Persist.Types.Base import Data.Time (Day(..), TimeOfDay, UTCTime, parseTimeM) import Data.Time (defaultTimeLocale) #ifdef HIGH_PRECISION_DATE import Data.Time.Clock.POSIX (posixSecondsToUTCTime) #endif -- | This class teaches Persistent how to take a custom type and marshal it to and from a 'PersistValue', allowing it to be stored in a database. -- -- ==== __Examples__ -- -- ===== Simple Newtype -- -- You can use @newtype@ to add more type safety/readability to a basis type like 'ByteString'. In these cases, just derive 'PersistField' and @PersistFieldSql@: -- -- @ -- {-\# LANGUAGE GeneralizedNewtypeDeriving #-} -- -- newtype HashedPassword = HashedPassword 'ByteString' -- deriving (Eq, Show, 'PersistField', PersistFieldSql) -- @ -- -- ===== Smart Constructor Newtype -- -- In this example, we create a 'PersistField' instance for a newtype following the "Smart Constructor" pattern. 
-- -- @ -- {-\# LANGUAGE GeneralizedNewtypeDeriving #-} -- import qualified "Data.Text" as T -- import qualified "Data.Char" as C -- -- -- | An American Social Security Number -- newtype SSN = SSN 'Text' -- deriving (Eq, Show, PersistFieldSql) -- -- mkSSN :: 'Text' -> 'Either' 'Text' SSN -- mkSSN t = if (T.length t == 9) && (T.all C.isDigit t) -- then 'Right' $ SSN t -- else 'Left' $ "Invalid SSN: " <> t -- -- instance 'PersistField' SSN where -- 'toPersistValue' (SSN t) = 'PersistText' t -- 'fromPersistValue' ('PersistText' t) = mkSSN t -- -- Handle cases where the database does not give us PersistText -- 'fromPersistValue' x = 'Left' $ "File.hs: When trying to deserialize an SSN: expected PersistText, received: " <> T.pack (show x) -- @ -- -- Tips: -- -- * This file contain dozens of 'PersistField' instances you can look at for examples. -- * Typically custom 'PersistField' instances will only accept a single 'PersistValue' constructor in 'fromPersistValue'. -- * Internal 'PersistField' instances accept a wide variety of 'PersistValue's to accomodate e.g. storing booleans as integers, booleans or strings. -- * If you're making a custom instance and using a SQL database, you'll also need @PersistFieldSql@ to specify the type of the database column. class PersistField a where toPersistValue :: a -> PersistValue fromPersistValue :: PersistValue -> Either T.Text a #ifndef NO_OVERLAP instance {-# OVERLAPPING #-} PersistField [Char] where toPersistValue = PersistText . T.pack fromPersistValue (PersistText s) = Right $ T.unpack s fromPersistValue (PersistByteString bs) = Right $ T.unpack $ TE.decodeUtf8With TERR.lenientDecode bs fromPersistValue (PersistInt64 i) = Right $ Prelude.show i fromPersistValue (PersistDouble d) = Right $ Prelude.show d fromPersistValue (PersistRational r) = Right $ Prelude.show r fromPersistValue (PersistDay d) = Right $ Prelude.show d fromPersistValue (PersistTimeOfDay d) = Right $ Prelude.show d fromPersistValue (PersistUTCTime d) = Right $ Prelude.show d fromPersistValue PersistNull = Left $ T.pack "Unexpected null" fromPersistValue (PersistBool b) = Right $ Prelude.show b fromPersistValue (PersistList _) = Left $ T.pack "Cannot convert PersistList to String" fromPersistValue (PersistMap _) = Left $ T.pack "Cannot convert PersistMap to String" fromPersistValue (PersistDbSpecific _) = Left $ T.pack "Cannot convert PersistDbSpecific to String. See the documentation of PersistDbSpecific for an example of using a custom database type with Persistent." fromPersistValue (PersistArray _) = Left $ T.pack "Cannot convert PersistArray to String" fromPersistValue (PersistObjectId _) = Left $ T.pack "Cannot convert PersistObjectId to String" #endif instance PersistField ByteString where toPersistValue = PersistByteString fromPersistValue (PersistByteString bs) = Right bs fromPersistValue x = TE.encodeUtf8 <$> fromPersistValue x instance PersistField T.Text where toPersistValue = PersistText fromPersistValue = fromPersistValueText instance PersistField TL.Text where toPersistValue = toPersistValue . TL.toStrict fromPersistValue = fmap TL.fromStrict . fromPersistValue instance PersistField Html where toPersistValue = PersistText . TL.toStrict . renderHtml fromPersistValue = fmap (preEscapedToMarkup :: T.Text -> Html) . fromPersistValue instance PersistField Int where toPersistValue = PersistInt64 . 
fromIntegral fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue (PersistDouble i) = Right (truncate i :: Int) -- oracle fromPersistValue x = Left $ fromPersistValueError "Int" "integer" x instance PersistField Int8 where toPersistValue = PersistInt64 . fromIntegral fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue (PersistDouble i) = Right (truncate i :: Int8) -- oracle fromPersistValue (PersistByteString bs) = case readInt bs of -- oracle Just (i,"") -> Right $ fromIntegral i Just (i,extra) -> Left $ extraInputError "Int64" bs i extra Nothing -> Left $ intParseError "Int64" bs fromPersistValue x = Left $ fromPersistValueError "Int8" "integer" x instance PersistField Int16 where toPersistValue = PersistInt64 . fromIntegral fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue (PersistDouble i) = Right (truncate i :: Int16) -- oracle fromPersistValue (PersistByteString bs) = case readInt bs of -- oracle Just (i,"") -> Right $ fromIntegral i Just (i,extra) -> Left $ extraInputError "Int64" bs i extra Nothing -> Left $ intParseError "Int64" bs fromPersistValue x = Left $ fromPersistValueError "Int16" "integer" x instance PersistField Int32 where toPersistValue = PersistInt64 . fromIntegral fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue (PersistDouble i) = Right (truncate i :: Int32) -- oracle fromPersistValue (PersistByteString bs) = case readInt bs of -- oracle Just (i,"") -> Right $ fromIntegral i Just (i,extra) -> Left $ extraInputError "Int64" bs i extra Nothing -> Left $ intParseError "Int64" bs fromPersistValue x = Left $ fromPersistValueError "Int32" "integer" x instance PersistField Int64 where toPersistValue = PersistInt64 fromPersistValue (PersistInt64 i) = Right i fromPersistValue (PersistDouble i) = Right (truncate i :: Int64) -- oracle fromPersistValue (PersistByteString bs) = case readInt bs of -- oracle Just (i,"") -> Right $ fromIntegral i Just (i,extra) -> Left $ extraInputError "Int64" bs i extra Nothing -> Left $ intParseError "Int64" bs fromPersistValue x = Left $ fromPersistValueError "Int64" "integer" x extraInputError :: (Show result) => Text -- ^ Haskell type -> ByteString -- ^ Original bytestring -> result -- ^ Integer result -> ByteString -- ^ Extra bytestring -> Text -- ^ Error message extraInputError haskellType original result extra = T.concat [ "Parsed " , TE.decodeUtf8 original , " into Haskell type `" , haskellType , "` with value" , T.pack $ show result , "but had extra input: " , TE.decodeUtf8 extra ] intParseError :: Text -- ^ Haskell type -> ByteString -- ^ Original bytestring -> Text -- ^ Error message intParseError haskellType original = T.concat [ "Failed to parse Haskell type `" , haskellType , " from " , TE.decodeUtf8 original ] instance PersistField Data.Word.Word where toPersistValue = PersistInt64 . fromIntegral fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue x = Left $ fromPersistValueError "Word" "integer" x instance PersistField Word8 where toPersistValue = PersistInt64 . fromIntegral fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue x = Left $ fromPersistValueError "Word8" "integer" x instance PersistField Word16 where toPersistValue = PersistInt64 . fromIntegral fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue x = Left $ fromPersistValueError "Word16" "integer" x instance PersistField Word32 where toPersistValue = PersistInt64 . 
fromIntegral fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue x = Left $ fromPersistValueError "Word32" "integer" x instance PersistField Word64 where toPersistValue = PersistInt64 . fromIntegral fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue x = Left $ fromPersistValueError "Word64" "integer" x instance PersistField Double where toPersistValue = PersistDouble fromPersistValue (PersistDouble d) = Right d fromPersistValue (PersistRational r) = Right $ fromRational r fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue x = Left $ fromPersistValueError "Double" "double, rational, or integer" x instance (HasResolution a) => PersistField (Fixed a) where toPersistValue = PersistRational . toRational fromPersistValue (PersistRational r) = Right $ fromRational r fromPersistValue (PersistText t) = case reads $ T.unpack t of -- NOTE: Sqlite can store rationals just as string [(a, "")] -> Right a _ -> Left $ "Can not read " <> t <> " as Fixed" fromPersistValue (PersistDouble d) = Right $ realToFrac d fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue x = Left $ fromPersistValueError "Fixed" "rational, string, double, or integer" x instance PersistField Rational where toPersistValue = PersistRational fromPersistValue (PersistRational r) = Right r fromPersistValue (PersistDouble d) = Right $ toRational d fromPersistValue (PersistText t) = case reads $ T.unpack t of -- NOTE: Sqlite can store rationals just as string [(a, "")] -> Right $ toRational (a :: Pico) _ -> Left $ "Can not read " <> t <> " as Rational (Pico in fact)" fromPersistValue (PersistInt64 i) = Right $ fromIntegral i fromPersistValue (PersistByteString bs) = case double $ T.cons '0' $ TE.decodeUtf8With TERR.lenientDecode bs of Right (ret,"") -> Right $ toRational ret Right (a,b) -> Left $ "Invalid bytestring[" <> T.pack (show bs) <> "]: expected a double but returned " <> T.pack (show (a,b)) Left xs -> Left $ "Invalid bytestring[" <> T.pack (show bs) <> "]: expected a double but returned " <> T.pack (show xs) fromPersistValue x = Left $ fromPersistValueError "Rational" "rational, double, string, integer, or bytestring" x instance PersistField Bool where toPersistValue = PersistBool fromPersistValue (PersistBool b) = Right b fromPersistValue (PersistInt64 i) = Right $ i /= 0 fromPersistValue (PersistByteString i) = case readInt i of Just (0,"") -> Right False Just (1,"") -> Right True xs -> Left $ T.pack $ "Failed to parse Haskell type `Bool` from PersistByteString. Original value:" ++ show i ++ ". Parsed by `readInt` as " ++ (show xs) ++ ". Expected '1'." 
fromPersistValue x = Left $ fromPersistValueError "Bool" "boolean, integer, or bytestring of '1' or '0'" x instance PersistField Day where toPersistValue = PersistDay fromPersistValue (PersistDay d) = Right d fromPersistValue (PersistInt64 i) = Right $ ModifiedJulianDay $ toInteger i fromPersistValue x@(PersistText t) = case reads $ T.unpack t of (d, _):_ -> Right d _ -> Left $ fromPersistValueParseError "Day" x fromPersistValue x@(PersistByteString s) = case reads $ unpack s of (d, _):_ -> Right d _ -> Left $ fromPersistValueParseError "Day" x fromPersistValue x = Left $ fromPersistValueError "Day" "day, integer, string or bytestring" x instance PersistField TimeOfDay where toPersistValue = PersistTimeOfDay fromPersistValue (PersistTimeOfDay d) = Right d fromPersistValue x@(PersistText t) = case reads $ T.unpack t of (d, _):_ -> Right d _ -> Left $ fromPersistValueParseError "TimeOfDay" x fromPersistValue x@(PersistByteString s) = case reads $ unpack s of (d, _):_ -> Right d _ -> Left $ fromPersistValueParseError "TimeOfDay" x fromPersistValue x = Left $ fromPersistValueError "TimeOfDay" "time, string, or bytestring" x instance PersistField UTCTime where toPersistValue = PersistUTCTime fromPersistValue (PersistUTCTime d) = Right d #ifdef HIGH_PRECISION_DATE fromPersistValue (PersistInt64 i) = Right $ posixSecondsToUTCTime $ (/ (1000 * 1000 * 1000)) $ fromIntegral $ i #endif fromPersistValue x@(PersistText t) = case reads $ T.unpack t of (d, _):_ -> Right d _ -> case parse8601 $ T.unpack t of Nothing -> Left $ fromPersistValueParseError "UTCTime" x Just x' -> Right x' where #if MIN_VERSION_time(1,5,0) parse8601 = parseTimeM True defaultTimeLocale "%FT%T%Q" #else parse8601 = parseTime defaultTimeLocale "%FT%T%Q" #endif fromPersistValue x@(PersistByteString s) = case reads $ unpack s of (d, _):_ -> Right d _ -> Left $ fromPersistValueParseError "UTCTime" x fromPersistValue x = Left $ fromPersistValueError "UTCTime" "time, integer, string, or bytestring" x instance PersistField Natural where toPersistValue = (toPersistValue :: Int64 -> PersistValue) . fromIntegral fromPersistValue x = case (fromPersistValue x :: Either Text Int64) of Left err -> Left $ T.replace "Int64" "Natural" err Right int -> Right $ fromIntegral int -- TODO use bimap? instance PersistField a => PersistField (Maybe a) where toPersistValue Nothing = PersistNull toPersistValue (Just a) = toPersistValue a fromPersistValue PersistNull = Right Nothing fromPersistValue x = Just <$> fromPersistValue x instance {-# OVERLAPPABLE #-} PersistField a => PersistField [a] where toPersistValue = PersistList . fmap toPersistValue fromPersistValue (PersistList l) = fromPersistList l fromPersistValue (PersistText t) = fromPersistValue (PersistByteString $ TE.encodeUtf8 t) fromPersistValue (PersistByteString bs) | Just values <- A.decode' (L.fromChunks [bs]) = fromPersistList values -- avoid the need for a migration to fill in empty lists. -- also useful when Persistent is not the only one filling in the data fromPersistValue (PersistNull) = Right [] fromPersistValue x = Left $ fromPersistValueError "List" "list, string, bytestring or null" x instance PersistField a => PersistField (V.Vector a) where toPersistValue = toPersistValue . V.toList fromPersistValue = either (\e -> Left ("Failed to parse Haskell type `Vector`: " `T.append` e)) (Right . V.fromList) . fromPersistValue instance (Ord a, PersistField a) => PersistField (S.Set a) where toPersistValue = PersistList . fmap toPersistValue . 
S.toList fromPersistValue (PersistList list) = S.fromList <$> fromPersistList list fromPersistValue (PersistText t) = fromPersistValue (PersistByteString $ TE.encodeUtf8 t) fromPersistValue (PersistByteString bs) | Just values <- A.decode' (L.fromChunks [bs]) = S.fromList <$> fromPersistList values fromPersistValue PersistNull = Right S.empty fromPersistValue x = Left $ fromPersistValueError "Set" "list, string, bytestring or null" x instance (PersistField a, PersistField b) => PersistField (a,b) where toPersistValue (x,y) = PersistList [toPersistValue x, toPersistValue y] fromPersistValue v = case fromPersistValue v of Right [x,y] -> (,) <$> fromPersistValue x <*> fromPersistValue y Left e -> Left e _ -> Left $ T.pack $ "Expected 2 item PersistList, received: " ++ show v instance PersistField v => PersistField (IM.IntMap v) where toPersistValue = toPersistValue . IM.toList fromPersistValue = fmap IM.fromList . fromPersistValue instance PersistField v => PersistField (M.Map T.Text v) where toPersistValue = PersistMap . fmap (second toPersistValue) . M.toList fromPersistValue = fromPersistMap <=< getPersistMap instance PersistField PersistValue where toPersistValue = id fromPersistValue = Right fromPersistList :: PersistField a => [PersistValue] -> Either T.Text [a] fromPersistList = mapM fromPersistValue fromPersistMap :: PersistField v => [(T.Text, PersistValue)] -> Either T.Text (M.Map T.Text v) fromPersistMap = foldShortLeft fromPersistValue [] where -- a fold that short-circuits on Left. foldShortLeft f = go where go acc [] = Right $ M.fromList acc go acc ((k, v):kvs) = case f v of Left e -> Left e Right v' -> go ((k,v'):acc) kvs -- | FIXME Add documentation to that. getPersistMap :: PersistValue -> Either T.Text [(T.Text, PersistValue)] getPersistMap (PersistMap kvs) = Right kvs getPersistMap (PersistText t) = getPersistMap (PersistByteString $ TE.encodeUtf8 t) getPersistMap (PersistByteString bs) | Just pairs <- A.decode' (L.fromChunks [bs]) = Right pairs getPersistMap PersistNull = Right [] getPersistMap x = Left $ fromPersistValueError "[(Text, PersistValue)]" "map, string, bytestring or null" x data SomePersistField = forall a. PersistField a => SomePersistField a instance PersistField SomePersistField where toPersistValue (SomePersistField a) = toPersistValue a fromPersistValue x = fmap SomePersistField (fromPersistValue x :: Either Text Text) instance PersistField Checkmark where toPersistValue Active = PersistBool True toPersistValue Inactive = PersistNull fromPersistValue PersistNull = Right Inactive fromPersistValue (PersistBool True) = Right Active fromPersistValue (PersistInt64 1) = Right Active fromPersistValue (PersistByteString i) = case readInt i of Just (0,"") -> Left "Failed to parse Haskell type `Checkmark`: found `0`, expected `1` or NULL" Just (1,"") -> Right Active xs -> Left $ T.pack $ "Failed to parse Haskell type `Checkmark` from PersistByteString. Original value:" ++ show i ++ ". Parsed by `readInt` as " ++ (show xs) ++ ". Expected '1'." fromPersistValue (PersistBool False) = Left $ T.pack "PersistField Checkmark: found unexpected FALSE value" fromPersistValue other = Left $ fromPersistValueError "Checkmark" "boolean, integer, bytestring or null" other fromPersistValueError :: Text -- ^ Haskell type, should match Haskell name exactly, e.g. "Int64" -> Text -- ^ Database type(s), should appear different from Haskell name, e.g. "integer" or "INT", not "Int". 
-> PersistValue -- ^ Incorrect value -> Text -- ^ Error message fromPersistValueError haskellType databaseType received = T.concat [ "Failed to parse Haskell type `" , haskellType , "`; expected " , databaseType , " from database, but received: " , T.pack (show received) , ". Potential solution: Check that your database schema matches your Persistent model definitions." ] fromPersistValueParseError :: (Show a) => Text -- ^ Haskell type, should match Haskell name exactly, e.g. "Int64" -> a -- ^ Received value -> Text -- ^ Error message fromPersistValueParseError haskellType received = T.concat [ "Failed to parse Haskell type `" , haskellType , "`, but received " , T.pack (show received) ] persistent-2.10.5.2/Database/Persist/Class/PersistStore.hs0000644000000000000000000006157313464141060021544 0ustar0000000000000000{-# LANGUAGE ConstraintKinds #-} module Database.Persist.Class.PersistStore ( HasPersistBackend (..) , IsPersistBackend (..) , PersistRecordBackend , liftPersist , PersistCore (..) , PersistStoreRead (..) , PersistStoreWrite (..) , getEntity , getJust , getJustEntity , belongsTo , belongsToJust , insertEntity , insertRecord , ToBackendKey(..) , BackendCompatible(..) ) where import Control.Exception (throwIO) import Control.Monad.IO.Class (MonadIO, liftIO) import Control.Monad.Reader (MonadReader (ask), runReaderT) import Control.Monad.Trans.Reader (ReaderT) import qualified Data.Aeson as A import Data.Map (Map) import qualified Data.Map as Map import qualified Data.Maybe as Maybe import qualified Data.Text as T import Database.Persist.Class.PersistEntity import Database.Persist.Class.PersistField import Database.Persist.Types -- | Class which allows the plucking of a @BaseBackend backend@ from some larger type. -- For example, -- @ -- instance HasPersistBackend (SqlReadBackend, Int) where -- type BaseBackend (SqlReadBackend, Int) = SqlBackend -- persistBackend = unSqlReadBackend . fst -- @ class HasPersistBackend backend where type BaseBackend backend persistBackend :: backend -> BaseBackend backend -- | Class which witnesses that @backend@ is essentially the same as @BaseBackend backend@. -- That is, they're isomorphic and @backend@ is just some wrapper over @BaseBackend backend@. class (HasPersistBackend backend) => IsPersistBackend backend where -- | This function is how we actually construct and tag a backend as having read or write capabilities. -- It should be used carefully and only when actually constructing a @backend@. Careless use allows us -- to accidentally run a write query against a read-only database. mkPersistBackend :: BaseBackend backend -> backend -- | This class witnesses that two backend are compatible, and that you can -- convert from the @sub@ backend into the @sup@ backend. This is similar -- to the 'HasPersistBackend' and 'IsPersistBackend' classes, but where you -- don't want to fix the type associated with the 'PersistEntityBackend' of -- a record. -- -- Generally speaking, where you might have: -- -- @ -- foo :: -- ( 'PersistEntity' record -- , 'PeristEntityBackend' record ~ 'BaseBackend' backend -- , 'IsSqlBackend' backend -- ) -- @ -- -- this can be replaced with: -- -- @ -- foo :: -- ( 'PersistEntity' record, -- , 'PersistEntityBackend' record ~ backend -- , 'BackendCompatible' 'SqlBackend' backend -- ) -- @ -- -- This works for 'SqlReadBackend' because of the @instance 'BackendCompatible' 'SqlBackend' 'SqlReadBackend'@, without needing to go through the 'BaseBackend' type family. 
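--
-- Writing an instance for your own wrapper type is usually a one-line
-- projection. A minimal sketch (the @AppBackend@ newtype is hypothetical, not
-- part of this library):
--
-- @
-- newtype AppBackend = AppBackend { unAppBackend :: 'SqlBackend' }
--
-- instance 'BackendCompatible' 'SqlBackend' AppBackend where
--     projectBackend = unAppBackend
-- @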
-- -- Likewise, functions that are currently hardcoded to use 'SqlBackend' can be generalized: -- -- @ -- -- before: -- asdf :: 'ReaderT' 'SqlBackend' m () -- asdf = pure () -- -- -- after: -- asdf' :: 'BackendCompatible' SqlBackend backend => ReaderT backend m () -- asdf' = withReaderT 'projectBackend' asdf -- @ -- -- @since 2.7.1 class BackendCompatible sup sub where projectBackend :: sub -> sup -- | A convenient alias for common type signatures type PersistRecordBackend record backend = (PersistEntity record, PersistEntityBackend record ~ BaseBackend backend) liftPersist :: (MonadIO m, MonadReader backend m) => ReaderT backend IO b -> m b liftPersist f = do env <- ask liftIO $ runReaderT f env -- | 'ToBackendKey' converts a 'PersistEntity' 'Key' into a 'BackendKey' -- This can be used by each backend to convert between a 'Key' and a plain -- Haskell type. For Sql, that is done with 'toSqlKey' and 'fromSqlKey'. -- -- By default, a 'PersistEntity' uses the default 'BackendKey' for its Key -- and is an instance of ToBackendKey -- -- A 'Key' that instead uses a custom type will not be an instance of -- 'ToBackendKey'. class ( PersistEntity record , PersistEntityBackend record ~ backend , PersistCore backend ) => ToBackendKey backend record where toBackendKey :: Key record -> BackendKey backend fromBackendKey :: BackendKey backend -> Key record class PersistCore backend where data BackendKey backend class ( Show (BackendKey backend), Read (BackendKey backend) , Eq (BackendKey backend), Ord (BackendKey backend) , PersistCore backend , PersistField (BackendKey backend), A.ToJSON (BackendKey backend), A.FromJSON (BackendKey backend) ) => PersistStoreRead backend where -- | Get a record by identifier, if available. -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > getSpj :: MonadIO m => ReaderT SqlBackend m (Maybe User) -- > getSpj = get spjId -- -- > mspj <- getSpj -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will get this: -- -- > +------+-----+ -- > | name | age | -- > +------+-----+ -- > | SPJ | 40 | -- > +------+-----+ get :: (MonadIO m, PersistRecordBackend record backend) => Key record -> ReaderT backend m (Maybe record) -- | Get many records by their respective identifiers, if available. 
-- -- @since 2.8.1 -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>: -- -- > getUsers :: MonadIO m => ReaderT SqlBackend m (Map (Key User) User) -- > getUsers = getMany allkeys -- -- > musers <- getUsers -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will get these records: -- -- > +----+-------+-----+ -- > | id | name | age | -- > +----+-------+-----+ -- > | 1 | SPJ | 40 | -- > +----+-------+-----+ -- > | 2 | Simon | 41 | -- > +----+-------+-----+ getMany :: (MonadIO m, PersistRecordBackend record backend) => [Key record] -> ReaderT backend m (Map (Key record) record) getMany [] = return Map.empty getMany ks = do vs <- mapM get ks let kvs = zip ks vs let kvs' = (fmap Maybe.fromJust) `fmap` filter (\(_,v) -> Maybe.isJust v) kvs return $ Map.fromList kvs' class ( Show (BackendKey backend), Read (BackendKey backend) , Eq (BackendKey backend), Ord (BackendKey backend) , PersistStoreRead backend , PersistField (BackendKey backend), A.ToJSON (BackendKey backend), A.FromJSON (BackendKey backend) ) => PersistStoreWrite backend where -- | Create a new record in the database, returning an automatically created -- key (in SQL an auto-increment id). -- -- === __Example usage__ -- -- Using <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, let's insert a new user 'John'. -- -- > insertJohn :: MonadIO m => ReaderT SqlBackend m (Key User) -- > insertJohn = insert $ User "John" 30 -- -- > johnId <- insertJohn -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |40 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ -- > |3 |John |30 | -- > +-----+------+-----+ insert :: (MonadIO m, PersistRecordBackend record backend) => record -> ReaderT backend m (Key record) -- | Same as 'insert', but doesn't return a @Key@. -- -- === __Example usage__ -- -- with <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > insertJohn :: MonadIO m => ReaderT SqlBackend m (Key User) -- > insertJohn = insert_ $ User "John" 30 -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |40 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ -- > |3 |John |30 | -- > +-----+------+-----+ insert_ :: (MonadIO m, PersistRecordBackend record backend) => record -> ReaderT backend m () insert_ record = insert record >> return () -- | Create multiple records in the database and return their 'Key's. -- -- If you don't need the inserted 'Key's, use 'insertMany_'. -- -- The MongoDB and PostgreSQL backends insert all records and -- retrieve their keys in one database query. -- -- The SQLite and MySQL backends use the slow, default implementation of -- @mapM insert@. 
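--
-- The returned keys line up positionally with the input records, so they can
-- be zipped back together when full 'Entity' values are needed. A small
-- sketch:
--
-- > insertUsersAsEntities :: MonadIO m => [User] -> ReaderT SqlBackend m [Entity User]
-- > insertUsersAsEntities users = do
-- >     keys <- insertMany users
-- >     return (zipWith Entity keys users)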
-- -- === __Example usage__ -- -- with <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > insertUsers :: MonadIO m => ReaderT SqlBackend m [Key User] -- > insertUsers = insertMany [User "John" 30, User "Nick" 32, User "Jane" 20] -- -- > userIds <- insertUsers -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |40 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ -- > |3 |John |30 | -- > +-----+------+-----+ -- > |4 |Nick |32 | -- > +-----+------+-----+ -- > |5 |Jane |20 | -- > +-----+------+-----+ insertMany :: (MonadIO m, PersistRecordBackend record backend) => [record] -> ReaderT backend m [Key record] insertMany = mapM insert -- | Same as 'insertMany', but doesn't return any 'Key's. -- -- The MongoDB, PostgreSQL, SQLite and MySQL backends insert all records in -- one database query. -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > insertUsers_ :: MonadIO m => ReaderT SqlBackend m () -- > insertUsers_ = insertMany_ [User "John" 30, User "Nick" 32, User "Jane" 20] -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |40 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ -- > |3 |John |30 | -- > +-----+------+-----+ -- > |4 |Nick |32 | -- > +-----+------+-----+ -- > |5 |Jane |20 | -- > +-----+------+-----+ insertMany_ :: (MonadIO m, PersistRecordBackend record backend) => [record] -> ReaderT backend m () insertMany_ x = insertMany x >> return () -- | Same as 'insertMany_', but takes an 'Entity' instead of just a record. -- -- Useful when migrating data from one entity to another -- and want to preserve ids. -- -- The MongoDB, PostgreSQL, SQLite and MySQL backends insert all records in -- one database query. -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > insertUserEntityMany :: MonadIO m => ReaderT SqlBackend m () -- > insertUserEntityMany = insertEntityMany [SnakeEntity, EvaEntity] -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |40 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ -- > |3 |Snake |38 | -- > +-----+------+-----+ -- > |4 |Eva |38 | -- > +-----+------+-----+ insertEntityMany :: (MonadIO m, PersistRecordBackend record backend) => [Entity record] -> ReaderT backend m () insertEntityMany = mapM_ (\(Entity k record) -> insertKey k record) -- | Create a new record in the database using the given key. 
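--
-- With a SQL backend, a key can be built from a raw 'Int64' using @toSqlKey@
-- from "Database.Persist.Sql". A minimal sketch (the literal id is arbitrary):
--
-- > insertZoeAt42 :: MonadIO m => ReaderT SqlBackend m ()
-- > insertZoeAt42 = insertKey (toSqlKey 42) (User "Zoe" 27)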
-- -- === __Example usage__ -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > insertAliceKey :: MonadIO m => Key User -> ReaderT SqlBackend m () -- > insertAliceKey key = insertKey key $ User "Alice" 20 -- -- > insertAliceKey $ UserKey {unUserKey = SqlBackendKey {unSqlBackendKey = 3}} -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |40 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ -- > |3 |Alice |20 | -- > +-----+------+-----+ insertKey :: (MonadIO m, PersistRecordBackend record backend) => Key record -> record -> ReaderT backend m () -- | Put the record in the database with the given key. -- Unlike 'replace', if a record with the given key does not -- exist then a new record will be inserted. -- -- === __Example usage__ -- -- We try to explain 'repsert' using <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>. -- -- First, we insert Philip into <#dataset-persist-store-1 dataset-1>. -- -- > insertPhilip :: MonadIO m => ReaderT SqlBackend m (Key User) -- > insertPhilip = insert $ User "Philip" 42 -- -- > philipId <- insertPhilip -- -- This query will produce: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |40 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ -- > |3 |Philip|42 | -- > +-----+------+-----+ -- -- > repsertHaskell :: MonadIO m => Key record -> ReaderT SqlBackend m () -- > repsertHaskell id = repsert id $ User "Haskell" 81 -- -- > repsertHaskell philipId -- -- This query will replace Philip's record with Haskell's: -- -- > +-----+-----------------+--------+ -- > |id |name |age | -- > +-----+-----------------+--------+ -- > |1 |SPJ |40 | -- > +-----+-----------------+--------+ -- > |2 |Simon |41 | -- > +-----+-----------------+--------+ -- > |3 |Philip -> Haskell|42 -> 81| -- > +-----+-----------------+--------+ -- -- 'repsert' inserts the given record if the key doesn't exist. -- -- > repsertXToUnknown :: MonadIO m => ReaderT SqlBackend m () -- > repsertXToUnknown = repsert unknownId $ User "X" 999 -- -- For example, applying the above query to <#dataset-persist-store-1 dataset-1> will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |40 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ -- > |3 |X |999 | -- > +-----+------+-----+ repsert :: (MonadIO m, PersistRecordBackend record backend) => Key record -> record -> ReaderT backend m () -- | Put many entities into the database. -- -- Batch version of 'repsert' for SQL backends. -- -- Useful when migrating data from one entity to another -- and you want to preserve ids. -- -- @since 2.8.1 -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > repsertManyUsers :: MonadIO m => ReaderT SqlBackend m () -- > repsertManyUsers = repsertMany [(simonId, User "Philip" 20), (unknownId999, User "Mr. X" 999)] -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+----------------+---------+ -- > |id |name |age | -- > +-----+----------------+---------+ -- > |1 |SPJ |40 | -- > +-----+----------------+---------+ -- > |2 |Simon -> Philip |41 -> 20 | -- > +-----+----------------+---------+ -- > |999 |Mr. 
X |999 | -- > +-----+----------------+---------+ repsertMany :: (MonadIO m, PersistRecordBackend record backend) => [(Key record, record)] -> ReaderT backend m () repsertMany = mapM_ (uncurry repsert) -- | Replace the record in the database with the given -- key. Note that the result is undefined if such record does -- not exist, so you must use 'insertKey' or 'repsert' in -- these cases. -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1 schama-1> and <#dataset-persist-store-1 dataset-1>, -- -- > replaceSpj :: MonadIO m => User -> ReaderT SqlBackend m () -- > replaceSpj record = replace spjId record -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |Mike |45 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ replace :: (MonadIO m, PersistRecordBackend record backend) => Key record -> record -> ReaderT backend m () -- | Delete a specific record by identifier. Does nothing if record does -- not exist. -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > deleteSpj :: MonadIO m => ReaderT SqlBackend m () -- > deleteSpj = delete spjId -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ delete :: (MonadIO m, PersistRecordBackend record backend) => Key record -> ReaderT backend m () -- | Update individual fields on a specific record. -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > updateSpj :: MonadIO m => [Update User] -> ReaderT SqlBackend m () -- > updateSpj updates = update spjId updates -- -- > updateSpj [UserAge +=. 100] -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |140 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ update :: (MonadIO m, PersistRecordBackend record backend) => Key record -> [Update record] -> ReaderT backend m () -- | Update individual fields on a specific record, and retrieve the -- updated value from the database. -- -- Note that this function will throw an exception if the given key is not -- found in the database. -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > updateGetSpj :: MonadIO m => [Update User] -> ReaderT SqlBackend m User -- > updateGetSpj updates = updateGet spjId updates -- -- > spj <- updateGetSpj [UserAge +=. 100] -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |140 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ updateGet :: (MonadIO m, PersistRecordBackend record backend) => Key record -> [Update record] -> ReaderT backend m record updateGet key ups = do update key ups get key >>= maybe (liftIO $ throwIO $ KeyNotFound $ show key) return -- | Same as 'get', but for a non-null (not Maybe) foreign key. -- Unsafe unless your database is enforcing that the foreign key is valid. 
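--
-- A typical use is following a non-nullable foreign key on a related entity.
-- A hedged sketch (the @Post@ entity below is hypothetical and not part of the
-- reference schema):
--
-- > -- Post
-- > --     authorId UserId
-- > --     title    String
--
-- > postAuthor :: MonadIO m => Post -> ReaderT SqlBackend m User
-- > postAuthor post = getJust (postAuthorId post)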
-- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > getJustSpj :: MonadIO m => ReaderT SqlBackend m User -- > getJustSpj = getJust spjId -- -- > spj <- getJust spjId -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will get this record: -- -- > +----+------+-----+ -- > | id | name | age | -- > +----+------+-----+ -- > | 1 | SPJ | 40 | -- > +----+------+-----+ -- -- > getJustUnknown :: MonadIO m => ReaderT SqlBackend m User -- > getJustUnknown = getJust unknownId -- -- mrx <- getJustUnknown -- -- This just throws an error. getJust :: ( PersistStoreRead backend , PersistRecordBackend record backend , MonadIO m ) => Key record -> ReaderT backend m record getJust key = get key >>= maybe (liftIO $ throwIO $ PersistForeignConstraintUnmet $ T.pack $ show key) return -- | Same as 'getJust', but returns an 'Entity' instead of just the record. -- -- @since 2.6.1 -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > getJustEntitySpj :: MonadIO m => ReaderT SqlBackend m (Entity User) -- > getJustEntitySpj = getJustEntity spjId -- -- > spjEnt <- getJustEntitySpj -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will get this entity: -- -- > +----+------+-----+ -- > | id | name | age | -- > +----+------+-----+ -- > | 1 | SPJ | 40 | -- > +----+------+-----+ getJustEntity :: (PersistEntityBackend record ~ BaseBackend backend ,MonadIO m ,PersistEntity record ,PersistStoreRead backend) => Key record -> ReaderT backend m (Entity record) getJustEntity key = do record <- getJust key return $ Entity { entityKey = key , entityVal = record } -- | Curry this to make a convenience function that loads an associated model. -- -- > foreign = belongsTo foreignId belongsTo :: ( PersistStoreRead backend , PersistEntity ent1 , PersistRecordBackend ent2 backend , MonadIO m ) => (ent1 -> Maybe (Key ent2)) -> ent1 -> ReaderT backend m (Maybe ent2) belongsTo foreignKeyField model = case foreignKeyField model of Nothing -> return Nothing Just f -> get f -- | Same as 'belongsTo', but uses @getJust@ and therefore is similarly unsafe. belongsToJust :: ( PersistStoreRead backend , PersistEntity ent1 , PersistRecordBackend ent2 backend , MonadIO m ) => (ent1 -> Key ent2) -> ent1 -> ReaderT backend m ent2 belongsToJust getForeignKey model = getJust $ getForeignKey model -- | Like @insert@, but returns the complete @Entity@. -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > insertHaskellEntity :: MonadIO m => ReaderT SqlBackend m (Entity User) -- > insertHaskellEntity = insertEntity $ User "Haskell" 81 -- -- > haskellEnt <- insertHaskellEntity -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +----+---------+-----+ -- > | id | name | age | -- > +----+---------+-----+ -- > | 1 | SPJ | 40 | -- > +----+---------+-----+ -- > | 2 | Simon | 41 | -- > +----+---------+-----+ -- > | 3 | Haskell | 81 | -- > +----+---------+-----+ insertEntity :: ( PersistStoreWrite backend , PersistRecordBackend e backend , MonadIO m ) => e -> ReaderT backend m (Entity e) insertEntity e = do eid <- insert e return $ Entity eid e -- | Like @get@, but returns the complete @Entity@. 
-- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > getSpjEntity :: MonadIO m => ReaderT SqlBackend m (Maybe (Entity User)) -- > getSpjEntity = getEntity spjId -- -- > mSpjEnt <- getSpjEntity -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will get this entity: -- -- > +----+------+-----+ -- > | id | name | age | -- > +----+------+-----+ -- > | 1 | SPJ | 40 | -- > +----+------+-----+ getEntity :: ( PersistStoreRead backend , PersistRecordBackend e backend , MonadIO m ) => Key e -> ReaderT backend m (Maybe (Entity e)) getEntity key = do maybeModel <- get key return $ fmap (key `Entity`) maybeModel -- | Like 'insertEntity' but just returns the record instead of 'Entity'. -- -- @since 2.6.1 -- -- === __Example usage__ -- -- With <#schema-persist-store-1 schema-1> and <#dataset-persist-store-1 dataset-1>, -- -- > insertDaveRecord :: MonadIO m => ReaderT SqlBackend m User -- > insertDaveRecord = insertRecord $ User "Dave" 50 -- -- > dave <- insertDaveRecord -- -- The above query when applied on <#dataset-persist-store-1 dataset-1>, will produce this: -- -- > +-----+------+-----+ -- > |id |name |age | -- > +-----+------+-----+ -- > |1 |SPJ |40 | -- > +-----+------+-----+ -- > |2 |Simon |41 | -- > +-----+------+-----+ -- > |3 |Dave |50 | -- > +-----+------+-----+ insertRecord :: (PersistEntityBackend record ~ BaseBackend backend ,PersistEntity record ,MonadIO m ,PersistStoreWrite backend) => record -> ReaderT backend m record insertRecord record = do insert_ record return $ record persistent-2.10.5.2/Database/Persist/Sql/Migration.hs0000644000000000000000000001651113554340146020517 0ustar0000000000000000module Database.Persist.Sql.Migration ( parseMigration , parseMigration' , printMigration , showMigration , getMigration , runMigration , runMigrationQuiet , runMigrationSilent , runMigrationUnsafe , runMigrationUnsafeQuiet , migrate -- * Utilities for constructing migrations , reportErrors , reportError , addMigrations , addMigration ) where import Control.Exception (throwIO) import Control.Monad (liftM, unless) import Control.Monad.IO.Unlift import Control.Monad.Trans.Class (MonadTrans (..)) import Control.Monad.Trans.Reader (ReaderT (..), ask) import Control.Monad.Trans.Writer import Data.Text (Text, unpack, snoc, isPrefixOf, pack) import qualified Data.Text.IO import System.IO import System.IO.Silently (hSilence) import Database.Persist.Sql.Types import Database.Persist.Sql.Raw import Database.Persist.Types import Database.Persist.Sql.Orphan.PersistStore() allSql :: CautiousMigration -> [Sql] allSql = map snd safeSql :: CautiousMigration -> [Sql] safeSql = allSql . filter (not . fst) -- | Given a 'Migration', this parses it and returns either a list of -- errors associated with the migration or a list of migrations to do. parseMigration :: MonadIO m => Migration -> ReaderT SqlBackend m (Either [Text] CautiousMigration) parseMigration = liftIOReader . liftM go . runWriterT . execWriterT where go ([], sql) = Right sql go (errs, _) = Left errs liftIOReader (ReaderT m) = ReaderT $ liftIO . m -- | Like 'parseMigration', but instead of returning the value in an -- 'Either' value, it calls 'error' on the error values. parseMigration' :: MonadIO m => Migration -> ReaderT SqlBackend m (CautiousMigration) parseMigration' m = do x <- parseMigration m case x of Left errs -> error $ unlines $ map unpack errs Right sql -> return sql -- | Prints a migration. 
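--
-- This is handy for previewing the DDL that 'runMigration' would execute,
-- without applying it. A minimal sketch, assuming a @runDB@ helper that runs a
-- 'SqlPersistT' action against your backend and a @migrateAll@ generated by
-- @mkMigrate@ (both hypothetical here):
--
-- > previewMigration :: IO ()
-- > previewMigration = runDB $ printMigration migrateAll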
printMigration :: MonadIO m => Migration -> ReaderT SqlBackend m () printMigration m = showMigration m >>= mapM_ (liftIO . Data.Text.IO.putStrLn) -- | Convert a 'Migration' to a list of 'Text' values corresponding to their -- 'Sql' statements. showMigration :: MonadIO m => Migration -> ReaderT SqlBackend m [Text] showMigration m = map (flip snoc ';') `liftM` getMigration m -- | Return all of the 'Sql' values associated with the given migration. -- Calls 'error' if there's a parse error on any migration. getMigration :: MonadIO m => Migration -> ReaderT SqlBackend m [Sql] getMigration m = do mig <- parseMigration' m return $ allSql mig -- | Runs a migration. If the migration fails to parse or if any of the -- migrations are unsafe, then this calls 'error' to halt the program. runMigration :: MonadIO m => Migration -> ReaderT SqlBackend m () runMigration m = runMigration' m False >> return () -- | Same as 'runMigration', but does not report the individual migrations on -- stderr. Instead it returns a list of the executed SQL commands. -- -- This is a safer/more robust alternative to 'runMigrationSilent', but may be -- less silent for some persistent implementations, most notably -- persistent-postgresql -- -- @since 2.10.2 runMigrationQuiet :: MonadIO m => Migration -> ReaderT SqlBackend m [Text] runMigrationQuiet m = runMigration' m True -- | Same as 'runMigration', but returns a list of the SQL commands executed -- instead of printing them to stderr. -- -- This function silences the migration by remapping 'stderr'. As a result, it -- is not thread-safe and can clobber output from other parts of the program. -- This implementation method was chosen to also silence postgresql migration -- output on stderr, but is not recommended! runMigrationSilent :: MonadUnliftIO m => Migration -> ReaderT SqlBackend m [Text] runMigrationSilent m = withRunInIO $ \run -> hSilence [stderr] $ run $ runMigration' m True -- | Run the given migration against the database. If the migration fails -- to parse, or there are any unsafe migrations, then this will error at -- runtime. This returns a list of the migrations that were executed. runMigration' :: MonadIO m => Migration -> Bool -- ^ is silent? -> ReaderT SqlBackend m [Text] runMigration' m silent = do mig <- parseMigration' m if any fst mig then liftIO . throwIO . PersistError . pack $ concat [ "\n\nDatabase migration: manual intervention required.\n" , "The unsafe actions are prefixed by '***' below:\n\n" , unlines $ map displayMigration mig ] else mapM (executeMigrate silent) $ sortMigrations $ safeSql mig where displayMigration :: (Bool, Sql) -> String displayMigration (True, s) = "*** " ++ unpack s ++ ";" displayMigration (False, s) = " " ++ unpack s ++ ";" -- | Like 'runMigration', but this will perform the unsafe database -- migrations instead of erroring out. runMigrationUnsafe :: MonadIO m => Migration -> ReaderT SqlBackend m () runMigrationUnsafe m = runMigrationUnsafe' False m >> return () -- | Same as 'runMigrationUnsafe', but returns a list of the SQL commands -- executed instead of printing them to stderr. 
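--
-- A minimal sketch that applies every migration, including unsafe ones, and
-- then logs what was run (assumes a @migrateAll@ from @mkMigrate@ and a
-- qualified import of @Data.Text.IO@):
--
-- > applyAllMigrations :: MonadIO m => ReaderT SqlBackend m ()
-- > applyAllMigrations = do
-- >     executed <- runMigrationUnsafeQuiet migrateAll
-- >     liftIO $ mapM_ Data.Text.IO.putStrLn executed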
-- -- @since 2.10.2 runMigrationUnsafeQuiet :: MonadIO m => Migration -> ReaderT SqlBackend m [Text] runMigrationUnsafeQuiet = runMigrationUnsafe' True runMigrationUnsafe' :: MonadIO m => Bool -> Migration -> ReaderT SqlBackend m [Text] runMigrationUnsafe' silent m = do mig <- parseMigration' m mapM (executeMigrate silent) $ sortMigrations $ allSql mig executeMigrate :: MonadIO m => Bool -> Text -> ReaderT SqlBackend m Text executeMigrate silent s = do unless silent $ liftIO $ hPutStrLn stderr $ "Migrating: " ++ unpack s rawExecute s [] return s -- | Sort the alter DB statements so tables are created before constraints are -- added. sortMigrations :: [Sql] -> [Sql] sortMigrations x = filter isCreate x ++ filter (not . isCreate) x where -- Note the use of lower-case e. This (hack) allows backends to explicitly -- choose to have this special sorting applied. isCreate t = pack "CREATe " `isPrefixOf` t -- | Given a list of old entity definitions and a new 'EntityDef' in -- @val@, this creates a 'Migration' to update the old list of definitions -- with the new one. migrate :: [EntityDef] -> EntityDef -> Migration migrate allDefs val = do conn <- lift $ lift ask res <- liftIO $ connMigrateSql conn allDefs (getStmtConn conn) val either reportErrors addMigrations res -- | Report a single error in a 'Migration'. -- -- @since 2.9.2 reportError :: Text -> Migration reportError = tell . pure -- | Report multiple errors in a 'Migration'. -- -- @since 2.9.2 reportErrors :: [Text] -> Migration reportErrors = tell -- | Add a migration to the migration plan. -- -- @since 2.9.2 addMigration :: Bool -- ^ Is the migration safe to run? (eg a non-destructive and idempotent -- update on the schema) -> Sql -- ^ A 'Text' value representing the command to run on the database. -> Migration addMigration isSafe sql = lift (tell [(isSafe, sql)]) -- | Add a 'CautiousMigration' (aka a @[('Bool', 'Text')]@) to the -- migration plan. -- -- @since 2.9.2 addMigrations :: CautiousMigration -> Migration addMigrations = lift . tell persistent-2.10.5.2/Database/Persist/Sql/Internal.hs0000644000000000000000000000513713523055753020347 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} -- | Intended for creating new backends. module Database.Persist.Sql.Internal ( mkColumns , defaultAttribute ) where import Data.Char (isSpace) import Data.Monoid (mappend, mconcat) import Data.Text (Text) import qualified Data.Text as T import Database.Persist.Quasi import Database.Persist.Sql.Types import Database.Persist.Types defaultAttribute :: [Attr] -> Maybe Text defaultAttribute [] = Nothing defaultAttribute (a:as) | Just d <- T.stripPrefix "default=" a = Just d | otherwise = defaultAttribute as -- | Create the list of columns for the given entity. 
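--
-- Backend authors typically call this from their migration code with every
-- known 'EntityDef' plus the entity currently being migrated. A rough sketch
-- of destructuring the result (the names are illustrative):
--
-- > let (columns, uniques, foreigns) = mkColumns allEntityDefs thisEntity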
mkColumns :: [EntityDef] -> EntityDef -> ([Column], [UniqueDef], [ForeignDef]) mkColumns allDefs t = (cols, entityUniques t, entityForeigns t) where cols :: [Column] cols = map go (entityFields t) tn :: DBName tn = entityDB t go :: FieldDef -> Column go fd = Column (fieldDB fd) (nullable (fieldAttrs fd) /= NotNullable || entitySum t) (fieldSqlType fd) (defaultAttribute $ fieldAttrs fd) Nothing (maxLen $ fieldAttrs fd) (ref (fieldDB fd) (fieldReference fd) (fieldAttrs fd)) maxLen :: [Attr] -> Maybe Integer maxLen [] = Nothing maxLen (a:as) | Just d <- T.stripPrefix "maxlen=" a = case reads (T.unpack d) of [(i, s)] | all isSpace s -> Just i _ -> error $ "Could not parse maxlen field with value " ++ show d ++ " on " ++ show tn | otherwise = maxLen as ref :: DBName -> ReferenceDef -> [Attr] -> Maybe (DBName, DBName) -- table name, constraint name ref c fe [] | ForeignRef f _ <- fe = Just (resolveTableName allDefs f, refName tn c) | otherwise = Nothing ref _ _ ("noreference":_) = Nothing ref c fe (a:as) | Just x <- T.stripPrefix "reference=" a = do constraintName <- snd <$> (ref c fe as) pure (DBName x, constraintName) | Just x <- T.stripPrefix "constraint=" a = do tableName <- fst <$> (ref c fe as) pure (tableName, DBName x) ref c x (_:as) = ref c x as refName :: DBName -> DBName -> DBName refName (DBName table) (DBName column) = DBName $ Data.Monoid.mconcat [table, "_", column, "_fkey"] resolveTableName :: [EntityDef] -> HaskellName -> DBName resolveTableName [] (HaskellName hn) = error $ "Table not found: " `Data.Monoid.mappend` T.unpack hn resolveTableName (e:es) hn | entityHaskell e == hn = entityDB e | otherwise = resolveTableName es hn persistent-2.10.5.2/Database/Persist/Sql/Types.hs0000644000000000000000000000752213620641325017671 0ustar0000000000000000module Database.Persist.Sql.Types ( module Database.Persist.Sql.Types , SqlBackend (..), SqlReadBackend (..), SqlWriteBackend (..) , Statement (..), LogFunc, InsertSqlResult (..) , readToUnknown, readToWrite, writeToUnknown , SqlBackendCanRead, SqlBackendCanWrite, SqlReadT, SqlWriteT, IsSqlBackend ) where import Control.Exception (Exception) import Control.Monad.Logger (NoLoggingT) import Control.Monad.Trans.Reader (ReaderT (..)) import Control.Monad.Trans.Resource (ResourceT) import Control.Monad.Trans.Writer (WriterT) import Data.Pool (Pool) import Data.Text (Text) import Data.Typeable (Typeable) import Database.Persist.Types import Database.Persist.Sql.Types.Internal data Column = Column { cName :: !DBName , cNull :: !Bool , cSqlType :: !SqlType , cDefault :: !(Maybe Text) , cDefaultConstraintName :: !(Maybe DBName) , cMaxLen :: !(Maybe Integer) , cReference :: !(Maybe (DBName, DBName)) -- table name, constraint name } deriving (Eq, Ord, Show) data PersistentSqlException = StatementAlreadyFinalized Text | Couldn'tGetSQLConnection deriving (Typeable, Show) instance Exception PersistentSqlException type SqlPersistT = ReaderT SqlBackend type SqlPersistM = SqlPersistT (NoLoggingT (ResourceT IO)) type Sql = Text -- Bool indicates if the Sql is safe type CautiousMigration = [(Bool, Sql)] -- | A 'Migration' is a four level monad stack consisting of: -- -- * @'WriterT' ['Text']@ representing a log of errors in the migrations. -- * @'WriterT' 'CautiousMigration'@ representing a list of migrations to -- run, along with whether or not they are safe. -- * @'ReaderT' 'SqlBackend'@, aka the 'SqlPersistT' transformer for -- database interop. -- * @'IO'@ for arbitrary IO. 
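--
-- Because it is a writer stack, hand-written steps can be combined with
-- generated ones using the helpers in "Database.Persist.Sql.Migration". A
-- hedged sketch (the index statement is illustrative only, and assumes
-- @OverloadedStrings@):
--
-- @
-- customIndexes :: Migration
-- customIndexes =
--     addMigration True "CREATE INDEX IF NOT EXISTS user_age_idx ON \"user\" (age)"
-- @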
type Migration = WriterT [Text] (WriterT CautiousMigration (ReaderT SqlBackend IO)) () type ConnectionPool = Pool SqlBackend -- $rawSql -- -- Although it covers most of the useful cases, @persistent@'s -- API may not be enough for some of your tasks. May be you need -- some complex @JOIN@ query, or a database-specific command -- needs to be issued. -- -- To issue raw SQL queries, use 'rawSql'. It does all the hard work of -- automatically parsing the rows of the result. It may return: -- -- * An 'Entity', that which 'selectList' returns. -- All of your entity's fields are -- automatically parsed. -- -- * A @'Single' a@, which is a single, raw column of type @a@. -- You may use a Haskell type (such as in your entity -- definitions), for example @Single Text@ or @Single Int@, -- or you may get the raw column value with @Single -- 'PersistValue'@. -- -- * A tuple combining any of these (including other tuples). -- Using tuples allows you to return many entities in one -- query. -- -- The only difference between issuing SQL queries with 'rawSql' -- and using other means is that we have an /entity selection/ -- /placeholder/, the double question mark @??@. It /must/ be -- used whenever you want to @SELECT@ an 'Entity' from your -- query. Here's a sample SQL query @sampleStmt@ that may be -- issued: -- -- @ -- SELECT ??, ?? -- FROM \"Person\", \"Likes\", \"Object\" -- WHERE \"Person\".id = \"Likes\".\"personId\" -- AND \"Object\".id = \"Likes\".\"objectId\" -- AND \"Person\".name LIKE ? -- @ -- -- To use that query, you could say -- -- @ -- do results <- 'rawSql' sampleStmt [\"%Luke%\"] -- forM_ results $ -- \\( Entity personKey person -- , Entity objectKey object -- ) -> do ... -- @ -- -- Note that 'rawSql' knows how to replace the double question -- marks @??@ because of the type of the @results@. -- | A single column (see 'rawSql'). Any 'PersistField' may be -- used here, including 'PersistValue' (which does not do any -- processing). 
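--
-- For instance, a bare aggregate can be read back as a @'Single' Int@
-- (a minimal sketch; the table name is only illustrative):
--
-- @
-- countPeople :: MonadIO m => ReaderT SqlBackend m [Single Int]
-- countPeople = rawSql \"SELECT COUNT(*) FROM person\" []
-- @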
newtype Single a = Single {unSingle :: a} deriving (Eq, Ord, Show, Read) persistent-2.10.5.2/Database/Persist/Sql/Raw.hs0000644000000000000000000002344013530011755017311 0ustar0000000000000000module Database.Persist.Sql.Raw where import Control.Exception (throwIO) import Control.Monad (when, liftM) import Control.Monad.IO.Class (MonadIO, liftIO) import Control.Monad.Logger (logDebugNS, runLoggingT) import Control.Monad.Reader (ReaderT, ask, MonadReader) import Control.Monad.Trans.Resource (MonadResource,release) import Data.Acquire (allocateAcquire, Acquire, mkAcquire, with) import Data.Conduit import Data.IORef (writeIORef, readIORef, newIORef) import qualified Data.Map as Map import Data.Int (Int64) import Data.Text (Text, pack) import qualified Data.Text as T import Database.Persist import Database.Persist.Sql.Types import Database.Persist.Sql.Class rawQuery :: (MonadResource m, MonadReader env m, BackendCompatible SqlBackend env) => Text -> [PersistValue] -> ConduitM () [PersistValue] m () rawQuery sql vals = do srcRes <- liftPersist $ rawQueryRes sql vals (releaseKey, src) <- allocateAcquire srcRes src release releaseKey rawQueryRes :: (MonadIO m1, MonadIO m2, BackendCompatible SqlBackend env) => Text -> [PersistValue] -> ReaderT env m1 (Acquire (ConduitM () [PersistValue] m2 ())) rawQueryRes sql vals = do conn <- projectBackend `liftM` ask let make = do runLoggingT (logDebugNS (pack "SQL") $ T.append sql $ pack $ "; " ++ show vals) (connLogFunc conn) getStmtConn conn sql return $ do stmt <- mkAcquire make stmtReset stmtQuery stmt vals -- | Execute a raw SQL statement rawExecute :: (MonadIO m, BackendCompatible SqlBackend backend) => Text -- ^ SQL statement, possibly with placeholders. -> [PersistValue] -- ^ Values to fill the placeholders. -> ReaderT backend m () rawExecute x y = liftM (const ()) $ rawExecuteCount x y -- | Execute a raw SQL statement and return the number of -- rows it has modified. rawExecuteCount :: (MonadIO m, BackendCompatible SqlBackend backend) => Text -- ^ SQL statement, possibly with placeholders. -> [PersistValue] -- ^ Values to fill the placeholders. -> ReaderT backend m Int64 rawExecuteCount sql vals = do conn <- projectBackend `liftM` ask runLoggingT (logDebugNS (pack "SQL") $ T.append sql $ pack $ "; " ++ show vals) (connLogFunc conn) stmt <- getStmt sql res <- liftIO $ stmtExecute stmt vals liftIO $ stmtReset stmt return res getStmt :: (MonadIO m, BackendCompatible SqlBackend backend) => Text -> ReaderT backend m Statement getStmt sql = do conn <- projectBackend `liftM` ask liftIO $ getStmtConn conn sql getStmtConn :: SqlBackend -> Text -> IO Statement getStmtConn conn sql = do smap <- liftIO $ readIORef $ connStmtMap conn case Map.lookup sql smap of Just stmt -> return stmt Nothing -> do stmt' <- liftIO $ connPrepare conn sql iactive <- liftIO $ newIORef True let stmt = Statement { stmtFinalize = do active <- readIORef iactive when active $ do stmtFinalize stmt' writeIORef iactive False , stmtReset = do active <- readIORef iactive when active $ stmtReset stmt' , stmtExecute = \x -> do active <- readIORef iactive if active then stmtExecute stmt' x else throwIO $ StatementAlreadyFinalized sql , stmtQuery = \x -> do active <- liftIO $ readIORef iactive if active then stmtQuery stmt' x else liftIO $ throwIO $ StatementAlreadyFinalized sql } liftIO $ writeIORef (connStmtMap conn) $ Map.insert sql stmt smap return stmt -- | Execute a raw SQL statement and return its results as a -- list. 
If you do not expect a return value, use of -- `rawExecute` is recommended. -- -- If you're using 'Entity'@s@ (which is quite likely), then you -- /must/ use entity selection placeholders (double question -- mark, @??@). These @??@ placeholders are then replaced for -- the names of the columns that we need for your entities. -- You'll receive an error if you don't use the placeholders. -- Please see the 'Entity'@s@ documentation for more details. -- -- You may put value placeholders (question marks, @?@) in your -- SQL query. These placeholders are then replaced by the values -- you pass on the second parameter, already correctly escaped. -- You may want to use 'toPersistValue' to help you constructing -- the placeholder values. -- -- Since you're giving a raw SQL statement, you don't get any -- guarantees regarding safety. If 'rawSql' is not able to parse -- the results of your query back, then an exception is raised. -- However, most common problems are mitigated by using the -- entity selection placeholder @??@, and you shouldn't see any -- error at all if you're not using 'Single'. -- -- Some example of 'rawSql' based on this schema: -- -- @ -- share [mkPersist sqlSettings, mkMigrate "migrateAll"] [persistLowerCase| -- Person -- name String -- age Int Maybe -- deriving Show -- BlogPost -- title String -- authorId PersonId -- deriving Show -- |] -- @ -- -- Examples based on the above schema: -- -- @ -- getPerson :: MonadIO m => ReaderT SqlBackend m [Entity Person] -- getPerson = rawSql "select ?? from person where name=?" [PersistText "john"] -- -- getAge :: MonadIO m => ReaderT SqlBackend m [Single Int] -- getAge = rawSql "select person.age from person where name=?" [PersistText "john"] -- -- getAgeName :: MonadIO m => ReaderT SqlBackend m [(Single Int, Single Text)] -- getAgeName = rawSql "select person.age, person.name from person where name=?" [PersistText "john"] -- -- getPersonBlog :: MonadIO m => ReaderT SqlBackend m [(Entity Person, Entity BlogPost)] -- getPersonBlog = rawSql "select ??,?? from person,blog_post where person.id = blog_post.author_id" [] -- @ -- -- Minimal working program for PostgreSQL backend based on the above concepts: -- -- > {-# LANGUAGE EmptyDataDecls #-} -- > {-# LANGUAGE FlexibleContexts #-} -- > {-# LANGUAGE GADTs #-} -- > {-# LANGUAGE GeneralizedNewtypeDeriving #-} -- > {-# LANGUAGE MultiParamTypeClasses #-} -- > {-# LANGUAGE OverloadedStrings #-} -- > {-# LANGUAGE QuasiQuotes #-} -- > {-# LANGUAGE TemplateHaskell #-} -- > {-# LANGUAGE TypeFamilies #-} -- > -- > import Control.Monad.IO.Class (liftIO) -- > import Control.Monad.Logger (runStderrLoggingT) -- > import Database.Persist -- > import Control.Monad.Reader -- > import Data.Text -- > import Database.Persist.Sql -- > import Database.Persist.Postgresql -- > import Database.Persist.TH -- > -- > share [mkPersist sqlSettings, mkMigrate "migrateAll"] [persistLowerCase| -- > Person -- > name String -- > age Int Maybe -- > deriving Show -- > |] -- > -- > conn = "host=localhost dbname=new_db user=postgres password=postgres port=5432" -- > -- > getPerson :: MonadIO m => ReaderT SqlBackend m [Entity Person] -- > getPerson = rawSql "select ?? from person where name=?" 
[PersistText "sibi"] -- > -- > liftSqlPersistMPool y x = liftIO (runSqlPersistMPool y x) -- > -- > main :: IO () -- > main = runStderrLoggingT $ withPostgresqlPool conn 10 $ liftSqlPersistMPool $ do -- > runMigration migrateAll -- > xs <- getPerson -- > liftIO (print xs) -- > rawSql :: (RawSql a, MonadIO m, BackendCompatible SqlBackend backend) => Text -- ^ SQL statement, possibly with placeholders. -> [PersistValue] -- ^ Values to fill the placeholders. -> ReaderT backend m [a] rawSql stmt = run where getType :: (x -> m [a]) -> a getType = error "rawSql.getType" x = getType run process = rawSqlProcessRow withStmt' colSubsts params sink = do srcRes <- rawQueryRes sql params liftIO $ with srcRes (\src -> runConduit $ src .| sink) where sql = T.concat $ makeSubsts colSubsts $ T.splitOn placeholder stmt placeholder = "??" makeSubsts (s:ss) (t:ts) = t : s : makeSubsts ss ts makeSubsts [] [] = [] makeSubsts [] ts = [T.intercalate placeholder ts] makeSubsts ss [] = error (concat err) where err = [ "rawsql: there are still ", show (length ss) , "'??' placeholder substitutions to be made " , "but all '??' placeholders have already been " , "consumed. Please read 'rawSql's documentation " , "on how '??' placeholders work." ] run params = do conn <- projectBackend `liftM` ask let (colCount, colSubsts) = rawSqlCols (connEscapeName conn) x withStmt' colSubsts params $ firstRow colCount firstRow colCount = do mrow <- await case mrow of Nothing -> return [] Just row | colCount == length row -> getter mrow | otherwise -> fail $ concat [ "rawSql: wrong number of columns, got " , show (length row), " but expected ", show colCount , " (", rawSqlColCountReason x, ")." ] getter = go id where go acc Nothing = return (acc []) go acc (Just row) = case process row of Left err -> fail (T.unpack err) Right r -> await >>= go (acc . (r:)) persistent-2.10.5.2/Database/Persist/Sql/Run.hs0000644000000000000000000002644713614061357017344 0ustar0000000000000000{-# LANGUAGE ScopedTypeVariables #-} module Database.Persist.Sql.Run where import Control.Exception (bracket, mask, onException) import Control.Monad (liftM) import Control.Monad.IO.Unlift import qualified UnliftIO.Exception as UE import Control.Monad.Logger.CallStack import Control.Monad.Reader (MonadReader) import qualified Control.Monad.Reader as MonadReader import Control.Monad.Trans.Reader hiding (local) import Control.Monad.Trans.Resource import Data.Acquire (Acquire, ReleaseType(..), mkAcquireType, with) import Data.IORef (readIORef) import Data.Pool (Pool, LocalPool) import Data.Pool as P import qualified Data.Map as Map import qualified Data.Text as T import System.Timeout (timeout) import Database.Persist.Class.PersistStore import Database.Persist.Sql.Types import Database.Persist.Sql.Types.Internal (IsolationLevel) import Database.Persist.Sql.Raw -- | The returned 'Acquire' gets a connection from the pool, but does __NOT__ -- start a new transaction. Used to implement 'acquireSqlConnFromPool' and -- 'acquireSqlConnFromPoolWithIsolation', this is useful for performing actions -- on a connection that cannot be done within a transaction, such as VACUUM in -- Sqlite. -- -- @since 2.10.5 unsafeAcquireSqlConnFromPool :: forall backend m . 
(MonadReader (Pool backend) m, BackendCompatible SqlBackend backend) => m (Acquire backend) unsafeAcquireSqlConnFromPool = do pool <- MonadReader.ask let freeConn :: (backend, LocalPool backend) -> ReleaseType -> IO () freeConn (res, localPool) relType = case relType of ReleaseException -> P.destroyResource pool localPool res _ -> P.putResource localPool res return $ fst <$> mkAcquireType (P.takeResource pool) freeConn -- | The returned 'Acquire' gets a connection from the pool, starts a new -- transaction and gives access to the prepared connection. -- -- When the acquired connection is released the transaction is committed and -- the connection returned to the pool. -- -- Upon an exception the transaction is rolled back and the connection -- destroyed. -- -- This is equivalent to 'runSqlPool' but does not incur the 'MonadUnliftIO' -- constraint, meaning it can be used within, for example, a 'Conduit' -- pipeline. -- -- @since 2.10.5 acquireSqlConnFromPool :: (MonadReader (Pool backend) m, BackendCompatible SqlBackend backend) => m (Acquire backend) acquireSqlConnFromPool = do connFromPool <- unsafeAcquireSqlConnFromPool return $ connFromPool >>= acquireSqlConn -- | Like 'acquireSqlConnFromPool', but lets you specify an explicit isolation -- level. -- -- @since 2.10.5 acquireSqlConnFromPoolWithIsolation :: (MonadReader (Pool backend) m, BackendCompatible SqlBackend backend) => IsolationLevel -> m (Acquire backend) acquireSqlConnFromPoolWithIsolation isolation = do connFromPool <- unsafeAcquireSqlConnFromPool return $ connFromPool >>= acquireSqlConnWithIsolation isolation -- | Get a connection from the pool, run the given action, and then return the -- connection to the pool. -- -- Note: This function previously timed out after 2 seconds, but this behavior -- was buggy and caused more problems than it solved. Since version 2.1.2, it -- performs no timeout checks. runSqlPool :: (MonadUnliftIO m, BackendCompatible SqlBackend backend) => ReaderT backend m a -> Pool backend -> m a runSqlPool r pconn = with (acquireSqlConnFromPool pconn) $ runReaderT r -- | Like 'runSqlPool', but supports specifying an isolation level. -- -- @since 2.9.0 runSqlPoolWithIsolation :: (MonadUnliftIO m, BackendCompatible SqlBackend backend) => ReaderT backend m a -> Pool backend -> IsolationLevel -> m a runSqlPoolWithIsolation r pconn i = with (acquireSqlConnFromPoolWithIsolation i pconn) $ runReaderT r -- | Like 'withResource', but times out the operation if resource -- allocation does not complete within the given timeout period. -- -- @since 2.0.0 withResourceTimeout :: forall a m b. (MonadUnliftIO m) => Int -- ^ Timeout period in microseconds -> Pool a -> (a -> m b) -> m (Maybe b) {-# SPECIALIZE withResourceTimeout :: Int -> Pool a -> (a -> IO b) -> IO (Maybe b) #-} withResourceTimeout ms pool act = withRunInIO $ \runInIO -> mask $ \restore -> do mres <- timeout ms $ takeResource pool case mres of Nothing -> runInIO $ return (Nothing :: Maybe b) Just (resource, local) -> do ret <- restore (runInIO (liftM Just $ act resource)) `onException` destroyResource pool local resource putResource local resource return ret {-# INLINABLE withResourceTimeout #-} rawAcquireSqlConn :: forall backend m . 
(MonadReader backend m, BackendCompatible SqlBackend backend) => Maybe IsolationLevel -> m (Acquire backend) rawAcquireSqlConn isolation = do conn <- MonadReader.ask let rawConn :: SqlBackend rawConn = projectBackend conn getter :: T.Text -> IO Statement getter = getStmtConn rawConn beginTransaction :: IO backend beginTransaction = conn <$ connBegin rawConn getter isolation finishTransaction :: backend -> ReleaseType -> IO () finishTransaction _ relType = case relType of ReleaseException -> connRollback rawConn getter _ -> connCommit rawConn getter return $ mkAcquireType beginTransaction finishTransaction -- | Starts a new transaction on the connection. When the acquired connection -- is released the transaction is committed and the connection returned to the -- pool. -- -- Upon an exception the transaction is rolled back and the connection -- destroyed. -- -- This is equivalent to 'runSqlConn but does not incur the 'MonadUnliftIO' -- constraint, meaning it can be used within, for example, a 'Conduit' -- pipeline. -- -- @since 2.10.5 acquireSqlConn :: (MonadReader backend m, BackendCompatible SqlBackend backend) => m (Acquire backend) acquireSqlConn = rawAcquireSqlConn Nothing -- | Like 'acquireSqlConn', but lets you specify an explicit isolation level. -- -- @since 2.10.5 acquireSqlConnWithIsolation :: (MonadReader backend m, BackendCompatible SqlBackend backend) => IsolationLevel -> m (Acquire backend) acquireSqlConnWithIsolation = rawAcquireSqlConn . Just runSqlConn :: (MonadUnliftIO m, BackendCompatible SqlBackend backend) => ReaderT backend m a -> backend -> m a runSqlConn r conn = with (acquireSqlConn conn) $ runReaderT r -- | Like 'runSqlConn', but supports specifying an isolation level. -- -- @since 2.9.0 runSqlConnWithIsolation :: (MonadUnliftIO m, BackendCompatible SqlBackend backend) => ReaderT backend m a -> backend -> IsolationLevel -> m a runSqlConnWithIsolation r conn isolation = with (acquireSqlConnWithIsolation isolation conn) $ runReaderT r runSqlPersistM :: (BackendCompatible SqlBackend backend) => ReaderT backend (NoLoggingT (ResourceT IO)) a -> backend -> IO a runSqlPersistM x conn = runResourceT $ runNoLoggingT $ runSqlConn x conn runSqlPersistMPool :: (BackendCompatible SqlBackend backend) => ReaderT backend (NoLoggingT (ResourceT IO)) a -> Pool backend -> IO a runSqlPersistMPool x pool = runResourceT $ runNoLoggingT $ runSqlPool x pool liftSqlPersistMPool :: (MonadIO m, BackendCompatible SqlBackend backend) => ReaderT backend (NoLoggingT (ResourceT IO)) a -> Pool backend -> m a liftSqlPersistMPool x pool = liftIO (runSqlPersistMPool x pool) withSqlPool :: (MonadLogger m, MonadUnliftIO m, BackendCompatible SqlBackend backend) => (LogFunc -> IO backend) -- ^ create a new connection -> Int -- ^ connection count -> (Pool backend -> m a) -> m a withSqlPool mkConn connCount f = withUnliftIO $ \u -> bracket (unliftIO u $ createSqlPool mkConn connCount) destroyAllResources (unliftIO u . f) createSqlPool :: forall m backend. (MonadLogger m, MonadUnliftIO m, BackendCompatible SqlBackend backend) => (LogFunc -> IO backend) -> Int -> m (Pool backend) createSqlPool mkConn size = do logFunc <- askLogFunc -- Resource pool will swallow any exceptions from close. We want to log -- them instead. 
let loggedClose :: backend -> IO () loggedClose backend = close' backend `UE.catchAny` \e -> runLoggingT (logError $ T.pack $ "Error closing database connection in pool: " ++ show e) logFunc liftIO $ createPool (mkConn logFunc) loggedClose 1 20 size -- NOTE: This function is a terrible, ugly hack. It would be much better to -- just clean up monad-logger. -- -- FIXME: in a future release, switch over to the new askLoggerIO function -- added in monad-logger 0.3.10. That function was not available at the time -- this code was written. askLogFunc :: forall m. (MonadUnliftIO m, MonadLogger m) => m LogFunc askLogFunc = withRunInIO $ \run -> return $ \a b c d -> run (monadLoggerLog a b c d) -- | Create a connection and run sql queries within it. This function -- automatically closes the connection on it's completion. -- -- === __Example usage__ -- -- > {-# LANGUAGE GADTs #-} -- > {-# LANGUAGE ScopedTypeVariables #-} -- > {-# LANGUAGE OverloadedStrings #-} -- > {-# LANGUAGE MultiParamTypeClasses #-} -- > {-# LANGUAGE TypeFamilies#-} -- > {-# LANGUAGE TemplateHaskell#-} -- > {-# LANGUAGE QuasiQuotes#-} -- > {-# LANGUAGE GeneralizedNewtypeDeriving #-} -- > -- > import Control.Monad.IO.Class (liftIO) -- > import Control.Monad.Logger -- > import Conduit -- > import Database.Persist -- > import Database.Sqlite -- > import Database.Persist.Sqlite -- > import Database.Persist.TH -- > -- > share [mkPersist sqlSettings, mkMigrate "migrateAll"] [persistLowerCase| -- > Person -- > name String -- > age Int Maybe -- > deriving Show -- > |] -- > -- > openConnection :: LogFunc -> IO SqlBackend -- > openConnection logfn = do -- > conn <- open "/home/sibi/test.db" -- > wrapConnection conn logfn -- > -- > main :: IO () -- > main = do -- > runNoLoggingT $ runResourceT $ withSqlConn openConnection (\backend -> -- > flip runSqlConn backend $ do -- > runMigration migrateAll -- > insert_ $ Person "John doe" $ Just 35 -- > insert_ $ Person "Divya" $ Just 36 -- > (pers :: [Entity Person]) <- selectList [] [] -- > liftIO $ print pers -- > return () -- > ) -- -- On executing it, you get this output: -- -- > Migrating: CREATE TABLE "person"("id" INTEGER PRIMARY KEY,"name" VARCHAR NOT NULL,"age" INTEGER NULL) -- > [Entity {entityKey = PersonKey {unPersonKey = SqlBackendKey {unSqlBackendKey = 1}}, entityVal = Person {personName = "John doe", personAge = Just 35}},Entity {entityKey = PersonKey {unPersonKey = SqlBackendKey {unSqlBackendKey = 2}}, entityVal = Person {personName = "Hema", personAge = Just 36}}] -- withSqlConn :: (MonadUnliftIO m, MonadLogger m, BackendCompatible SqlBackend backend) => (LogFunc -> IO backend) -> (backend -> m a) -> m a withSqlConn open f = do logFunc <- askLogFunc withRunInIO $ \run -> bracket (open logFunc) close' (run . f) close' :: (BackendCompatible SqlBackend backend) => backend -> IO () close' conn = do readIORef (connStmtMap $ projectBackend conn) >>= mapM_ stmtFinalize . Map.elems connClose $ projectBackend conn persistent-2.10.5.2/Database/Persist/Sql/Class.hs0000644000000000000000000004444513620641325017637 0ustar0000000000000000{-# LANGUAGE CPP #-} {-# LANGUAGE FlexibleInstances #-} {-# LANGUAGE PatternGuards #-} {-# LANGUAGE ScopedTypeVariables #-} {-# LANGUAGE TypeSynonymInstances #-} {-# LANGUAGE UndecidableInstances #-} {-# LANGUAGE DataKinds #-} module Database.Persist.Sql.Class ( RawSql (..) , PersistFieldSql (..) , EntityWithPrefix(..) 
, unPrefix ) where import GHC.TypeLits (Symbol, KnownSymbol, symbolVal) import Data.Bits (bitSizeMaybe) import Data.ByteString (ByteString) import Data.Fixed import Data.Int import qualified Data.IntMap as IM import qualified Data.Map as M import Data.Maybe (fromMaybe) import Data.Monoid ((<>)) import Data.Proxy (Proxy(..)) import qualified Data.Set as S import Data.Text (Text, intercalate, pack) import qualified Data.Text as T import qualified Data.Text.Lazy as TL import Data.Time (UTCTime, TimeOfDay, Day) import qualified Data.Vector as V import Data.Word import Numeric.Natural (Natural) import Text.Blaze.Html (Html) import Database.Persist import Database.Persist.Sql.Types -- | Class for data types that may be retrived from a 'rawSql' -- query. class RawSql a where -- | Number of columns that this data type needs and the list -- of substitutions for @SELECT@ placeholders @??@. rawSqlCols :: (DBName -> Text) -> a -> (Int, [Text]) -- | A string telling the user why the column count is what -- it is. rawSqlColCountReason :: a -> String -- | Transform a row of the result into the data type. rawSqlProcessRow :: [PersistValue] -> Either Text a instance PersistField a => RawSql (Single a) where rawSqlCols _ _ = (1, []) rawSqlColCountReason _ = "one column for a 'Single' data type" rawSqlProcessRow [pv] = Single <$> fromPersistValue pv rawSqlProcessRow _ = Left $ pack "RawSql (Single a): wrong number of columns." instance (PersistEntity a, PersistEntityBackend a ~ backend, IsPersistBackend backend) => RawSql (Key a) where rawSqlCols _ key = (length $ keyToValues key, []) rawSqlColCountReason key = "The primary key is composed of " ++ (show $ length $ keyToValues key) ++ " columns" rawSqlProcessRow = keyFromValues instance (PersistEntity record, PersistEntityBackend record ~ backend, IsPersistBackend backend) => RawSql (Entity record) where rawSqlCols escape _ent = (length sqlFields, [intercalate ", " sqlFields]) where sqlFields = map (((name <> ".") <>) . escape) $ map fieldDB -- Hacky for a composite key because -- it selects the same field multiple times $ entityKeyFields entDef ++ entityFields entDef name = escape (entityDB entDef) entDef = entityDef (Nothing :: Maybe record) rawSqlColCountReason a = case fst (rawSqlCols (error "RawSql") a) of 1 -> "one column for an 'Entity' data type without fields" n -> show n ++ " columns for an 'Entity' data type" rawSqlProcessRow row = case splitAt nKeyFields row of (rowKey, rowVal) -> Entity <$> keyFromValues rowKey <*> fromPersistValues rowVal where nKeyFields = length $ entityKeyFields entDef entDef = entityDef (Nothing :: Maybe record) -- | This newtype wrapper is useful when selecting an entity out of the -- database and you want to provide a prefix to the table being selected. -- -- Consider this raw SQL query: -- -- > SELECT ?? -- > FROM my_long_table_name AS mltn -- > INNER JOIN other_table AS ot -- > ON mltn.some_col = ot.other_col -- > WHERE ... -- -- We don't want to refer to @my_long_table_name@ every time, so we create -- an alias. If we want to select it, we have to tell the raw SQL -- quasi-quoter that we expect the entity to be prefixed with some other -- name. 
-- -- We can give the above query a type with this, like: -- -- @ -- getStuff :: 'SqlPersistM' ['EntityWithPrefix' \"mltn\" MyLongTableName] -- getStuff = rawSql queryText [] -- @ -- -- The 'EntityWithPrefix' bit is a boilerplate newtype wrapper, so you can -- remove it with 'unPrefix', like this: -- -- @ -- getStuff :: 'SqlPersistM' ['Entity' MyLongTableName] -- getStuff = 'unPrefix' @\"mltn\" '<$>' 'rawSql' queryText [] -- @ -- -- The @ symbol is a "type application" and requires the @TypeApplications@ -- language extension. -- -- @since 2.10.5 newtype EntityWithPrefix (prefix :: Symbol) record = EntityWithPrefix { unEntityWithPrefix :: Entity record } -- | A helper function to tell GHC what the 'EntityWithPrefix' prefix -- should be. This allows you to use a type application to specify the -- prefix, instead of specifying the etype on the result. -- -- As an example, here's code that uses this: -- -- @ -- myQuery :: 'SqlPersistM' ['Entity' Person] -- myQuery = map (unPrefix @\"p\") <$> rawSql query [] -- where -- query = "SELECT ?? FROM person AS p" -- @ -- -- @since 2.10.5 unPrefix :: forall prefix record. EntityWithPrefix prefix record -> Entity record unPrefix = unEntityWithPrefix instance ( PersistEntity record , KnownSymbol prefix , PersistEntityBackend record ~ backend , IsPersistBackend backend ) => RawSql (EntityWithPrefix prefix record) where rawSqlCols escape _ent = (length sqlFields, [intercalate ", " sqlFields]) where sqlFields = map (((name <> ".") <>) . escape) $ map fieldDB -- Hacky for a composite key because -- it selects the same field multiple times $ entityKeyFields entDef ++ entityFields entDef name = pack $ symbolVal (Proxy :: Proxy prefix) entDef = entityDef (Nothing :: Maybe record) rawSqlColCountReason a = case fst (rawSqlCols (error "RawSql") a) of 1 -> "one column for an 'Entity' data type without fields" n -> show n ++ " columns for an 'Entity' data type" rawSqlProcessRow row = case splitAt nKeyFields row of (rowKey, rowVal) -> fmap EntityWithPrefix $ Entity <$> keyFromValues rowKey <*> fromPersistValues rowVal where nKeyFields = length $ entityKeyFields entDef entDef = entityDef (Nothing :: Maybe record) -- | @since 1.0.1 instance RawSql a => RawSql (Maybe a) where rawSqlCols e = rawSqlCols e . extractMaybe rawSqlColCountReason = rawSqlColCountReason . extractMaybe rawSqlProcessRow cols | all isNull cols = return Nothing | otherwise = case rawSqlProcessRow cols of Right v -> Right (Just v) Left msg -> Left $ "RawSql (Maybe a): not all columns were Null " <> "but the inner parser has failed. Its message " <> "was \"" <> msg <> "\". Did you apply Maybe " <> "to a tuple, perhaps? The main use case for " <> "Maybe is to allow OUTER JOINs to be written, " <> "in which case 'Maybe (Entity v)' is used." where isNull PersistNull = True isNull _ = False instance (RawSql a, RawSql b) => RawSql (a, b) where rawSqlCols e x = rawSqlCols e (fst x) # rawSqlCols e (snd x) where (cnta, lsta) # (cntb, lstb) = (cnta + cntb, lsta ++ lstb) rawSqlColCountReason x = rawSqlColCountReason (fst x) ++ ", " ++ rawSqlColCountReason (snd x) rawSqlProcessRow = let x = getType processRow getType :: (z -> Either y x) -> x getType = error "RawSql.getType" colCountFst = fst $ rawSqlCols (error "RawSql.getType2") (fst x) processRow row = let (rowFst, rowSnd) = splitAt colCountFst row in (,) <$> rawSqlProcessRow rowFst <*> rawSqlProcessRow rowSnd in colCountFst `seq` processRow -- Avoids recalculating 'colCountFst'. 
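-- The instances below reduce larger tuples to nested pairs via the fromN/toN
-- helpers, so the pair instance above is the only place where columns are
-- actually split. A query returning three values per row might look like the
-- following sketch (table and column names are illustrative only):
--
-- > getRow :: MonadIO m => ReaderT SqlBackend m [(Single Int, Single Text, Single Bool)]
-- > getRow = rawSql "SELECT p.age, p.name, p.admin FROM person AS p" []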
instance (RawSql a, RawSql b, RawSql c) => RawSql (a, b, c) where rawSqlCols e = rawSqlCols e . from3 rawSqlColCountReason = rawSqlColCountReason . from3 rawSqlProcessRow = fmap to3 . rawSqlProcessRow from3 :: (a,b,c) -> ((a,b),c) from3 (a,b,c) = ((a,b),c) to3 :: ((a,b),c) -> (a,b,c) to3 ((a,b),c) = (a,b,c) instance (RawSql a, RawSql b, RawSql c, RawSql d) => RawSql (a, b, c, d) where rawSqlCols e = rawSqlCols e . from4 rawSqlColCountReason = rawSqlColCountReason . from4 rawSqlProcessRow = fmap to4 . rawSqlProcessRow from4 :: (a,b,c,d) -> ((a,b),(c,d)) from4 (a,b,c,d) = ((a,b),(c,d)) to4 :: ((a,b),(c,d)) -> (a,b,c,d) to4 ((a,b),(c,d)) = (a,b,c,d) instance (RawSql a, RawSql b, RawSql c, RawSql d, RawSql e) => RawSql (a, b, c, d, e) where rawSqlCols e = rawSqlCols e . from5 rawSqlColCountReason = rawSqlColCountReason . from5 rawSqlProcessRow = fmap to5 . rawSqlProcessRow from5 :: (a,b,c,d,e) -> ((a,b),(c,d),e) from5 (a,b,c,d,e) = ((a,b),(c,d),e) to5 :: ((a,b),(c,d),e) -> (a,b,c,d,e) to5 ((a,b),(c,d),e) = (a,b,c,d,e) instance (RawSql a, RawSql b, RawSql c, RawSql d, RawSql e, RawSql f) => RawSql (a, b, c, d, e, f) where rawSqlCols e = rawSqlCols e . from6 rawSqlColCountReason = rawSqlColCountReason . from6 rawSqlProcessRow = fmap to6 . rawSqlProcessRow from6 :: (a,b,c,d,e,f) -> ((a,b),(c,d),(e,f)) from6 (a,b,c,d,e,f) = ((a,b),(c,d),(e,f)) to6 :: ((a,b),(c,d),(e,f)) -> (a,b,c,d,e,f) to6 ((a,b),(c,d),(e,f)) = (a,b,c,d,e,f) instance (RawSql a, RawSql b, RawSql c, RawSql d, RawSql e, RawSql f, RawSql g) => RawSql (a, b, c, d, e, f, g) where rawSqlCols e = rawSqlCols e . from7 rawSqlColCountReason = rawSqlColCountReason . from7 rawSqlProcessRow = fmap to7 . rawSqlProcessRow from7 :: (a,b,c,d,e,f,g) -> ((a,b),(c,d),(e,f),g) from7 (a,b,c,d,e,f,g) = ((a,b),(c,d),(e,f),g) to7 :: ((a,b),(c,d),(e,f),g) -> (a,b,c,d,e,f,g) to7 ((a,b),(c,d),(e,f),g) = (a,b,c,d,e,f,g) instance (RawSql a, RawSql b, RawSql c, RawSql d, RawSql e, RawSql f, RawSql g, RawSql h) => RawSql (a, b, c, d, e, f, g, h) where rawSqlCols e = rawSqlCols e . from8 rawSqlColCountReason = rawSqlColCountReason . from8 rawSqlProcessRow = fmap to8 . rawSqlProcessRow from8 :: (a,b,c,d,e,f,g,h) -> ((a,b),(c,d),(e,f),(g,h)) from8 (a,b,c,d,e,f,g,h) = ((a,b),(c,d),(e,f),(g,h)) to8 :: ((a,b),(c,d),(e,f),(g,h)) -> (a,b,c,d,e,f,g,h) to8 ((a,b),(c,d),(e,f),(g,h)) = (a,b,c,d,e,f,g,h) -- | @since 2.10.2 instance (RawSql a, RawSql b, RawSql c, RawSql d, RawSql e, RawSql f, RawSql g, RawSql h, RawSql i) => RawSql (a, b, c, d, e, f, g, h, i) where rawSqlCols e = rawSqlCols e . from9 rawSqlColCountReason = rawSqlColCountReason . from9 rawSqlProcessRow = fmap to9 . rawSqlProcessRow -- | @since 2.10.2 from9 :: (a,b,c,d,e,f,g,h,i) -> ((a,b),(c,d),(e,f),(g,h),i) from9 (a,b,c,d,e,f,g,h,i) = ((a,b),(c,d),(e,f),(g,h),i) -- | @since 2.10.2 to9 :: ((a,b),(c,d),(e,f),(g,h),i) -> (a,b,c,d,e,f,g,h,i) to9 ((a,b),(c,d),(e,f),(g,h),i) = (a,b,c,d,e,f,g,h,i) -- | @since 2.10.2 instance (RawSql a, RawSql b, RawSql c, RawSql d, RawSql e, RawSql f, RawSql g, RawSql h, RawSql i, RawSql j) => RawSql (a, b, c, d, e, f, g, h, i, j) where rawSqlCols e = rawSqlCols e . from10 rawSqlColCountReason = rawSqlColCountReason . from10 rawSqlProcessRow = fmap to10 . 
rawSqlProcessRow -- | @since 2.10.2 from10 :: (a,b,c,d,e,f,g,h,i,j) -> ((a,b),(c,d),(e,f),(g,h),(i,j)) from10 (a,b,c,d,e,f,g,h,i,j) = ((a,b),(c,d),(e,f),(g,h),(i,j)) -- | @since 2.10.2 to10 :: ((a,b),(c,d),(e,f),(g,h),(i,j)) -> (a,b,c,d,e,f,g,h,i,j) to10 ((a,b),(c,d),(e,f),(g,h),(i,j)) = (a,b,c,d,e,f,g,h,i,j) -- | @since 2.10.2 instance (RawSql a, RawSql b, RawSql c, RawSql d, RawSql e, RawSql f, RawSql g, RawSql h, RawSql i, RawSql j, RawSql k) => RawSql (a, b, c, d, e, f, g, h, i, j, k) where rawSqlCols e = rawSqlCols e . from11 rawSqlColCountReason = rawSqlColCountReason . from11 rawSqlProcessRow = fmap to11 . rawSqlProcessRow -- | @since 2.10.2 from11 :: (a,b,c,d,e,f,g,h,i,j,k) -> ((a,b),(c,d),(e,f),(g,h),(i,j),k) from11 (a,b,c,d,e,f,g,h,i,j,k) = ((a,b),(c,d),(e,f),(g,h),(i,j),k) -- | @since 2.10.2 to11 :: ((a,b),(c,d),(e,f),(g,h),(i,j),k) -> (a,b,c,d,e,f,g,h,i,j,k) to11 ((a,b),(c,d),(e,f),(g,h),(i,j),k) = (a,b,c,d,e,f,g,h,i,j,k) -- | @since 2.10.2 instance (RawSql a, RawSql b, RawSql c, RawSql d, RawSql e, RawSql f, RawSql g, RawSql h, RawSql i, RawSql j, RawSql k, RawSql l) => RawSql (a, b, c, d, e, f, g, h, i, j, k, l) where rawSqlCols e = rawSqlCols e . from12 rawSqlColCountReason = rawSqlColCountReason . from12 rawSqlProcessRow = fmap to12 . rawSqlProcessRow -- | @since 2.10.2 from12 :: (a,b,c,d,e,f,g,h,i,j,k,l) -> ((a,b),(c,d),(e,f),(g,h),(i,j),(k,l)) from12 (a,b,c,d,e,f,g,h,i,j,k,l) = ((a,b),(c,d),(e,f),(g,h),(i,j),(k,l)) -- | @since 2.10.2 to12 :: ((a,b),(c,d),(e,f),(g,h),(i,j),(k,l)) -> (a,b,c,d,e,f,g,h,i,j,k,l) to12 ((a,b),(c,d),(e,f),(g,h),(i,j),(k,l)) = (a,b,c,d,e,f,g,h,i,j,k,l) extractMaybe :: Maybe a -> a extractMaybe = fromMaybe (error "Database.Persist.GenericSql.extractMaybe") -- | Tells Persistent what database column type should be used to store a Haskell type. -- -- ==== __Examples__ -- -- ===== Simple Boolean Alternative -- -- @ -- data Switch = On | Off -- deriving (Show, Eq) -- -- instance 'PersistField' Switch where -- 'toPersistValue' s = case s of -- On -> 'PersistBool' True -- Off -> 'PersistBool' False -- 'fromPersistValue' ('PersistBool' b) = if b then 'Right' On else 'Right' Off -- 'fromPersistValue' x = Left $ "File.hs: When trying to deserialize a Switch: expected PersistBool, received: " <> T.pack (show x) -- -- instance 'PersistFieldSql' Switch where -- 'sqlType' _ = 'SqlBool' -- @ -- -- ===== Non-Standard Database Types -- -- If your database supports non-standard types, such as Postgres' @uuid@, you can use 'SqlOther' to use them: -- -- @ -- import qualified Data.UUID as UUID -- instance 'PersistField' UUID where -- 'toPersistValue' = 'PersistDbSpecific' . toASCIIBytes -- 'fromPersistValue' ('PersistDbSpecific' uuid) = -- case fromASCIIBytes uuid of -- 'Nothing' -> 'Left' $ "Model/CustomTypes.hs: Failed to deserialize a UUID; received: " <> T.pack (show uuid) -- 'Just' uuid' -> 'Right' uuid' -- 'fromPersistValue' x = Left $ "File.hs: When trying to deserialize a UUID: expected PersistDbSpecific, received: "-- > <> T.pack (show x) -- -- instance 'PersistFieldSql' UUID where -- 'sqlType' _ = 'SqlOther' "uuid" -- @ -- -- ===== User Created Database Types -- -- Similarly, some databases support creating custom types, e.g. Postgres' and features. 
You can use 'SqlOther' to specify a custom type: -- -- > CREATE DOMAIN ssn AS text -- > CHECK ( value ~ '^[0-9]{9}$'); -- -- @ -- instance 'PersistFieldSQL' SSN where -- 'sqlType' _ = 'SqlOther' "ssn" -- @ -- -- > CREATE TYPE rainbow_color AS ENUM ('red', 'orange', 'yellow', 'green', 'blue', 'indigo', 'violet'); -- -- @ -- instance 'PersistFieldSQL' RainbowColor where -- 'sqlType' _ = 'SqlOther' "rainbow_color" -- @ class PersistField a => PersistFieldSql a where sqlType :: Proxy a -> SqlType #ifndef NO_OVERLAP instance {-# OVERLAPPING #-} PersistFieldSql [Char] where sqlType _ = SqlString #endif instance PersistFieldSql ByteString where sqlType _ = SqlBlob instance PersistFieldSql T.Text where sqlType _ = SqlString instance PersistFieldSql TL.Text where sqlType _ = SqlString instance PersistFieldSql Html where sqlType _ = SqlString instance PersistFieldSql Int where sqlType _ | Just x <- bitSizeMaybe (0 :: Int), x <= 32 = SqlInt32 | otherwise = SqlInt64 instance PersistFieldSql Int8 where sqlType _ = SqlInt32 instance PersistFieldSql Int16 where sqlType _ = SqlInt32 instance PersistFieldSql Int32 where sqlType _ = SqlInt32 instance PersistFieldSql Int64 where sqlType _ = SqlInt64 instance PersistFieldSql Word where sqlType _ = SqlInt64 instance PersistFieldSql Word8 where sqlType _ = SqlInt32 instance PersistFieldSql Word16 where sqlType _ = SqlInt32 instance PersistFieldSql Word32 where sqlType _ = SqlInt64 instance PersistFieldSql Word64 where sqlType _ = SqlInt64 instance PersistFieldSql Double where sqlType _ = SqlReal instance PersistFieldSql Bool where sqlType _ = SqlBool instance PersistFieldSql Day where sqlType _ = SqlDay instance PersistFieldSql TimeOfDay where sqlType _ = SqlTime instance PersistFieldSql UTCTime where sqlType _ = SqlDayTime instance {-# OVERLAPPABLE #-} PersistFieldSql a => PersistFieldSql [a] where sqlType _ = SqlString instance PersistFieldSql a => PersistFieldSql (V.Vector a) where sqlType _ = SqlString instance (Ord a, PersistFieldSql a) => PersistFieldSql (S.Set a) where sqlType _ = SqlString instance (PersistFieldSql a, PersistFieldSql b) => PersistFieldSql (a,b) where sqlType _ = SqlString instance PersistFieldSql v => PersistFieldSql (IM.IntMap v) where sqlType _ = SqlString instance PersistFieldSql v => PersistFieldSql (M.Map T.Text v) where sqlType _ = SqlString instance PersistFieldSql PersistValue where sqlType _ = SqlInt64 -- since PersistValue should only be used like this for keys, which in SQL are Int64 instance PersistFieldSql Checkmark where sqlType _ = SqlBool instance (HasResolution a) => PersistFieldSql (Fixed a) where sqlType a = SqlNumeric long prec where prec = round $ (log $ fromIntegral $ resolution n) / (log 10 :: Double) -- FIXME: May lead to problems with big numbers long = prec + 10 -- FIXME: Is this enough ? 
n = 0 _mn = return n `asTypeOf` a instance PersistFieldSql Rational where sqlType _ = SqlNumeric 32 20 -- need to make this field big enough to handle Rational to Mumber string conversion for ODBC instance PersistFieldSql Natural where sqlType _ = SqlInt64 -- An embedded Entity instance (PersistField record, PersistEntity record) => PersistFieldSql (Entity record) where sqlType _ = SqlString persistent-2.10.5.2/Database/Persist/Sql/Orphan/PersistQuery.hs0000644000000000000000000004331313513622673022477 0ustar0000000000000000{-# LANGUAGE ExistentialQuantification #-} {-# LANGUAGE LambdaCase #-} {-# OPTIONS_GHC -fno-warn-orphans #-} module Database.Persist.Sql.Orphan.PersistQuery ( deleteWhereCount , updateWhereCount , decorateSQLWithLimitOffset ) where import Control.Exception (throwIO) import Control.Monad.IO.Class import Control.Monad.Trans.Reader (ReaderT, ask, withReaderT) import Data.ByteString.Char8 (readInteger) import Data.Conduit import qualified Data.Conduit.List as CL import Data.Int (Int64) import Data.List (transpose, inits, find) import Data.Maybe (isJust) import Data.Monoid (Monoid (..), (<>)) import qualified Data.Text as T import Data.Text (Text) import Database.Persist hiding (updateField) import Database.Persist.Sql.Util ( entityColumnNames, parseEntityValues, isIdField, updatePersistValue , mkUpdateText, commaSeparated, dbIdColumns) import Database.Persist.Sql.Types import Database.Persist.Sql.Raw import Database.Persist.Sql.Orphan.PersistStore (withRawQuery) -- orphaned instance for convenience of modularity instance PersistQueryRead SqlBackend where count filts = do conn <- ask let wher = if null filts then "" else filterClause False conn filts let sql = mconcat [ "SELECT COUNT(*) FROM " , connEscapeName conn $ entityDB t , wher ] withRawQuery sql (getFiltsValues conn filts) $ do mm <- CL.head case mm of Just [PersistInt64 i] -> return $ fromIntegral i Just [PersistDouble i] ->return $ fromIntegral (truncate i :: Int64) -- gb oracle Just [PersistByteString i] -> case readInteger i of -- gb mssql Just (ret,"") -> return $ fromIntegral ret xs -> error $ "invalid number i["++show i++"] xs[" ++ show xs ++ "]" Just xs -> error $ "count:invalid sql return xs["++show xs++"] sql["++show sql++"]" Nothing -> error $ "count:invalid sql returned nothing sql["++show sql++"]" where t = entityDef $ dummyFromFilts filts selectSourceRes filts opts = do conn <- ask srcRes <- rawQueryRes (sql conn) (getFiltsValues conn filts) return $ fmap (.| CL.mapM parse) srcRes where (limit, offset, orders) = limitOffsetOrder opts parse vals = case parseEntityValues t vals of Left s -> liftIO $ throwIO $ PersistMarshalError s Right row -> return row t = entityDef $ dummyFromFilts filts wher conn = if null filts then "" else filterClause False conn filts ord conn = case map (orderClause False conn) orders of [] -> "" ords -> " ORDER BY " <> T.intercalate "," ords cols = commaSeparated . 
entityColumnNames t sql conn = connLimitOffset conn (limit,offset) (not (null orders)) $ mconcat [ "SELECT " , cols conn , " FROM " , connEscapeName conn $ entityDB t , wher conn , ord conn ] selectKeysRes filts opts = do conn <- ask srcRes <- rawQueryRes (sql conn) (getFiltsValues conn filts) return $ fmap (.| CL.mapM parse) srcRes where t = entityDef $ dummyFromFilts filts cols conn = T.intercalate "," $ dbIdColumns conn t wher conn = if null filts then "" else filterClause False conn filts sql conn = connLimitOffset conn (limit,offset) (not (null orders)) $ mconcat [ "SELECT " , cols conn , " FROM " , connEscapeName conn $ entityDB t , wher conn , ord conn ] (limit, offset, orders) = limitOffsetOrder opts ord conn = case map (orderClause False conn) orders of [] -> "" ords -> " ORDER BY " <> T.intercalate "," ords parse xs = do keyvals <- case entityPrimary t of Nothing -> case xs of [PersistInt64 x] -> return [PersistInt64 x] [PersistDouble x] -> return [PersistInt64 (truncate x)] -- oracle returns Double _ -> return xs Just pdef -> let pks = map fieldHaskell $ compositeFields pdef keyvals = map snd $ filter (\(a, _) -> let ret=isJust (find (== a) pks) in ret) $ zip (map fieldHaskell $ entityFields t) xs in return keyvals case keyFromValues keyvals of Right k -> return k Left err -> error $ "selectKeysImpl: keyFromValues failed" <> show err instance PersistQueryRead SqlReadBackend where count filts = withReaderT persistBackend $ count filts selectSourceRes filts opts = withReaderT persistBackend $ selectSourceRes filts opts selectKeysRes filts opts = withReaderT persistBackend $ selectKeysRes filts opts instance PersistQueryRead SqlWriteBackend where count filts = withReaderT persistBackend $ count filts selectSourceRes filts opts = withReaderT persistBackend $ selectSourceRes filts opts selectKeysRes filts opts = withReaderT persistBackend $ selectKeysRes filts opts instance PersistQueryWrite SqlBackend where deleteWhere filts = do _ <- deleteWhereCount filts return () updateWhere filts upds = do _ <- updateWhereCount filts upds return () instance PersistQueryWrite SqlWriteBackend where deleteWhere filts = withReaderT persistBackend $ deleteWhere filts updateWhere filts upds = withReaderT persistBackend $ updateWhere filts upds -- | Same as 'deleteWhere', but returns the number of rows affected. -- -- @since 1.1.5 deleteWhereCount :: (PersistEntity val, MonadIO m, PersistEntityBackend val ~ SqlBackend, BackendCompatible SqlBackend backend) => [Filter val] -> ReaderT backend m Int64 deleteWhereCount filts = withReaderT projectBackend $ do conn <- ask let t = entityDef $ dummyFromFilts filts let wher = if null filts then "" else filterClause False conn filts sql = mconcat [ "DELETE FROM " , connEscapeName conn $ entityDB t , wher ] rawExecuteCount sql $ getFiltsValues conn filts -- | Same as 'updateWhere', but returns the number of rows affected. 
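--
-- For example (a sketch against a hypothetical @Person@ entity with
-- a @PersonName@ field, which is not defined in this module):
--
-- @
-- renameJohns :: MonadIO m => ReaderT SqlBackend m Int64
-- renameJohns = updateWhereCount [PersonName ==. \"john\"] [PersonName =. \"John\"]
-- @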
-- -- @since 1.1.5 updateWhereCount :: (PersistEntity val, MonadIO m, SqlBackend ~ PersistEntityBackend val, BackendCompatible SqlBackend backend) => [Filter val] -> [Update val] -> ReaderT backend m Int64 updateWhereCount _ [] = return 0 updateWhereCount filts upds = withReaderT projectBackend $ do conn <- ask let wher = if null filts then "" else filterClause False conn filts let sql = mconcat [ "UPDATE " , connEscapeName conn $ entityDB t , " SET " , T.intercalate "," $ map (mkUpdateText conn) upds , wher ] let dat = map updatePersistValue upds `Data.Monoid.mappend` getFiltsValues conn filts rawExecuteCount sql dat where t = entityDef $ dummyFromFilts filts fieldName :: forall record typ. (PersistEntity record, PersistEntityBackend record ~ SqlBackend) => EntityField record typ -> DBName fieldName f = fieldDB $ persistFieldDef f dummyFromFilts :: [Filter v] -> Maybe v dummyFromFilts _ = Nothing getFiltsValues :: forall val. (PersistEntity val, PersistEntityBackend val ~ SqlBackend) => SqlBackend -> [Filter val] -> [PersistValue] getFiltsValues conn = snd . filterClauseHelper False False conn OrNullNo data OrNull = OrNullYes | OrNullNo filterClauseHelper :: (PersistEntity val, PersistEntityBackend val ~ SqlBackend) => Bool -- ^ include table name? -> Bool -- ^ include WHERE? -> SqlBackend -> OrNull -> [Filter val] -> (Text, [PersistValue]) filterClauseHelper includeTable includeWhere conn orNull filters = (if not (T.null sql) && includeWhere then " WHERE " <> sql else sql, vals) where (sql, vals) = combineAND filters combineAND = combine " AND " combine s fs = (T.intercalate s $ map wrapP a, mconcat b) where (a, b) = unzip $ map go fs wrapP x = T.concat ["(", x, ")"] go (BackendFilter _) = error "BackendFilter not expected" go (FilterAnd []) = ("1=1", []) go (FilterAnd fs) = combineAND fs go (FilterOr []) = ("1=0", []) go (FilterOr fs) = combine " OR " fs go (Filter field value pfilter) = let t = entityDef $ dummyFromFilts [Filter field value pfilter] in case (isIdField field, entityPrimary t, allVals) of (True, Just pdef, PersistList ys:_) -> if length (compositeFields pdef) /= length ys then error $ "wrong number of entries in compositeFields vs PersistList allVals=" ++ show allVals else case (allVals, pfilter, isCompFilter pfilter) of ([PersistList xs], Eq, _) -> let sqlcl=T.intercalate " and " (map (\a -> connEscapeName conn (fieldDB a) <> showSqlFilter pfilter <> "? ") (compositeFields pdef)) in (wrapSql sqlcl,xs) ([PersistList xs], Ne, _) -> let sqlcl=T.intercalate " or " (map (\a -> connEscapeName conn (fieldDB a) <> showSqlFilter pfilter <> "? ") (compositeFields pdef)) in (wrapSql sqlcl,xs) (_, In, _) -> let xxs = transpose (map fromPersistList allVals) sqls=map (\(a,xs) -> connEscapeName conn (fieldDB a) <> showSqlFilter pfilter <> "(" <> T.intercalate "," (replicate (length xs) " ?") <> ") ") (zip (compositeFields pdef) xxs) in (wrapSql (T.intercalate " and " (map wrapSql sqls)), concat xxs) (_, NotIn, _) -> let xxs = transpose (map fromPersistList allVals) sqls=map (\(a,xs) -> connEscapeName conn (fieldDB a) <> showSqlFilter pfilter <> "(" <> T.intercalate "," (replicate (length xs) " ?") <> ") ") (zip (compositeFields pdef) xxs) in (wrapSql (T.intercalate " or " (map wrapSql sqls)), concat xxs) ([PersistList xs], _, True) -> let zs = tail (inits (compositeFields pdef)) sql1 = map (\b -> wrapSql (T.intercalate " and " (map (\(i,a) -> sql2 (i==length b) a) (zip [1..] 
b)))) zs sql2 islast a = connEscapeName conn (fieldDB a) <> (if islast then showSqlFilter pfilter else showSqlFilter Eq) <> "? " sqlcl = T.intercalate " or " sql1 in (wrapSql sqlcl, concat (tail (inits xs))) (_, BackendSpecificFilter _, _) -> error "unhandled type BackendSpecificFilter for composite/non id primary keys" _ -> error $ "unhandled type/filter for composite/non id primary keys pfilter=" ++ show pfilter ++ " persistList="++show allVals (True, Just pdef, []) -> error $ "empty list given as filter value filter=" ++ show pfilter ++ " persistList=" ++ show allVals ++ " pdef=" ++ show pdef (True, Just pdef, _) -> error $ "unhandled error for composite/non id primary keys filter=" ++ show pfilter ++ " persistList=" ++ show allVals ++ " pdef=" ++ show pdef _ -> case (isNull, pfilter, length notNullVals) of (True, Eq, _) -> (name <> " IS NULL", []) (True, Ne, _) -> (name <> " IS NOT NULL", []) (False, Ne, _) -> (T.concat [ "(" , name , " IS NULL OR " , name , " <> " , qmarks , ")" ], notNullVals) -- We use 1=2 (and below 1=1) to avoid using TRUE and FALSE, since -- not all databases support those words directly. (_, In, 0) -> ("1=2" <> orNullSuffix, []) (False, In, _) -> (name <> " IN " <> qmarks <> orNullSuffix, allVals) (True, In, _) -> (T.concat [ "(" , name , " IS NULL OR " , name , " IN " , qmarks , ")" ], notNullVals) (False, NotIn, 0) -> ("1=1", []) (True, NotIn, 0) -> (name <> " IS NOT NULL", []) (False, NotIn, _) -> (T.concat [ "(" , name , " IS NULL OR " , name , " NOT IN " , qmarks , ")" ], notNullVals) (True, NotIn, _) -> (T.concat [ "(" , name , " IS NOT NULL AND " , name , " NOT IN " , qmarks , ")" ], notNullVals) _ -> (name <> showSqlFilter pfilter <> "?" <> orNullSuffix, allVals) where isCompFilter Lt = True isCompFilter Le = True isCompFilter Gt = True isCompFilter Ge = True isCompFilter _ = False wrapSql sqlcl = "(" <> sqlcl <> ")" fromPersistList (PersistList xs) = xs fromPersistList other = error $ "expected PersistList but found " ++ show other filterValueToPersistValues :: forall a. PersistField a => FilterValue a -> [PersistValue] filterValueToPersistValues = \case FilterValue a -> [toPersistValue a] FilterValues as -> toPersistValue <$> as UnsafeValue x -> [toPersistValue x] orNullSuffix = case orNull of OrNullYes -> mconcat [" OR ", name, " IS NULL"] OrNullNo -> "" isNull = PersistNull `elem` allVals notNullVals = filter (/= PersistNull) allVals allVals = filterValueToPersistValues value tn = connEscapeName conn $ entityDB $ entityDef $ dummyFromFilts [Filter field value pfilter] name = (if includeTable then ((tn <> ".") <>) else id) $ connEscapeName conn $ fieldName field qmarks = case value of FilterValue{} -> "(?)" UnsafeValue{} -> "(?)" FilterValues xs -> let parens a = "(" <> a <> ")" commas = T.intercalate "," toQs = fmap $ const "?" nonNulls = filter (/= PersistNull) $ map toPersistValue xs in parens . commas . toQs $ nonNulls showSqlFilter Eq = "=" showSqlFilter Ne = "<>" showSqlFilter Gt = ">" showSqlFilter Lt = "<" showSqlFilter Ge = ">=" showSqlFilter Le = "<=" showSqlFilter In = " IN " showSqlFilter NotIn = " NOT IN " showSqlFilter (BackendSpecificFilter s) = s filterClause :: (PersistEntity val, PersistEntityBackend val ~ SqlBackend) => Bool -- ^ include table name? -> SqlBackend -> [Filter val] -> Text filterClause b c = fst . 
filterClauseHelper b True c OrNullNo orderClause :: (PersistEntity val, PersistEntityBackend val ~ SqlBackend) => Bool -- ^ include the table name -> SqlBackend -> SelectOpt val -> Text orderClause includeTable conn o = case o of Asc x -> name x Desc x -> name x <> " DESC" _ -> error "orderClause: expected Asc or Desc, not limit or offset" where dummyFromOrder :: SelectOpt a -> Maybe a dummyFromOrder _ = Nothing tn = connEscapeName conn $ entityDB $ entityDef $ dummyFromOrder o name :: (PersistEntityBackend record ~ SqlBackend, PersistEntity record) => EntityField record typ -> Text name x = (if includeTable then ((tn <> ".") <>) else id) $ connEscapeName conn $ fieldName x -- | Generates sql for limit and offset for postgres, sqlite and mysql. decorateSQLWithLimitOffset::Text -> (Int,Int) -> Bool -> Text -> Text decorateSQLWithLimitOffset nolimit (limit,offset) _ sql = let lim = case (limit, offset) of (0, 0) -> "" (0, _) -> T.cons ' ' nolimit (_, _) -> " LIMIT " <> T.pack (show limit) off = if offset == 0 then "" else " OFFSET " <> T.pack (show offset) in mconcat [ sql , lim , off ] persistent-2.10.5.2/Database/Persist/Sql/Orphan/PersistStore.hs0000644000000000000000000003620713621624146022467 0ustar0000000000000000{-# LANGUAGE ExistentialQuantification #-} {-# LANGUAGE GeneralizedNewtypeDeriving #-} {-# OPTIONS_GHC -fno-warn-orphans #-} module Database.Persist.Sql.Orphan.PersistStore ( withRawQuery , BackendKey(..) , toSqlKey , fromSqlKey , getFieldName , getTableName , tableDBName , fieldDBName ) where import Control.Exception (throwIO) import Control.Monad.IO.Class import Control.Monad.Trans.Reader (ReaderT, ask, withReaderT) import Data.Conduit (ConduitM, (.|), runConduit) import qualified Data.Conduit.List as CL import Data.Acquire (with) import qualified Data.Aeson as A import Data.ByteString.Char8 (readInteger) import Data.Conduit (ConduitM, (.|), runConduit) import qualified Data.Conduit.List as CL import qualified Data.Foldable as Foldable import Data.Function (on) import Data.Int (Int64) import Data.List (find, nubBy) import qualified Data.Map as Map import Data.Maybe (isJust) import Data.Monoid (mappend, (<>)) import Data.Text (Text, unpack) import qualified Data.Text as T import Data.Void (Void) import Web.PathPieces (PathPiece) import Web.HttpApiData (ToHttpApiData, FromHttpApiData) import Database.Persist import Database.Persist.Class () import Database.Persist.Sql.Class (PersistFieldSql) import Database.Persist.Sql.Raw import Database.Persist.Sql.Types import Database.Persist.Sql.Util ( dbIdColumns, keyAndEntityColumnNames, parseEntityValues, entityColumnNames , updatePersistValue, mkUpdateText, commaSeparated) withRawQuery :: MonadIO m => Text -> [PersistValue] -> ConduitM [PersistValue] Void IO a -> ReaderT SqlBackend m a withRawQuery sql vals sink = do srcRes <- rawQueryRes sql vals liftIO $ with srcRes (\src -> runConduit $ src .| sink) toSqlKey :: ToBackendKey SqlBackend record => Int64 -> Key record toSqlKey = fromBackendKey . SqlBackendKey fromSqlKey :: ToBackendKey SqlBackend record => Key record -> Int64 fromSqlKey = unSqlBackendKey . toBackendKey whereStmtForKey :: PersistEntity record => SqlBackend -> Key record -> Text whereStmtForKey conn k = T.intercalate " AND " $ map (<> "=? 
") $ dbIdColumns conn entDef where entDef = entityDef $ dummyFromKey k whereStmtForKeys :: PersistEntity record => SqlBackend -> [Key record] -> Text whereStmtForKeys conn ks = T.intercalate " OR " $ whereStmtForKey conn `fmap` ks -- | get the SQL string for the table that a PeristEntity represents -- Useful for raw SQL queries -- -- Your backend may provide a more convenient tableName function -- which does not operate in a Monad getTableName :: forall record m backend. ( PersistEntity record , BackendCompatible SqlBackend backend , Monad m ) => record -> ReaderT backend m Text getTableName rec = withReaderT projectBackend $ do conn <- ask return $ connEscapeName conn $ tableDBName rec -- | useful for a backend to implement tableName by adding escaping tableDBName :: (PersistEntity record) => record -> DBName tableDBName rec = entityDB $ entityDef (Just rec) -- | get the SQL string for the field that an EntityField represents -- Useful for raw SQL queries -- -- Your backend may provide a more convenient fieldName function -- which does not operate in a Monad getFieldName :: forall record typ m backend. ( PersistEntity record , PersistEntityBackend record ~ SqlBackend , BackendCompatible SqlBackend backend , Monad m ) => EntityField record typ -> ReaderT backend m Text getFieldName rec = withReaderT projectBackend $ do conn <- ask return $ connEscapeName conn $ fieldDBName rec -- | useful for a backend to implement fieldName by adding escaping fieldDBName :: forall record typ. (PersistEntity record) => EntityField record typ -> DBName fieldDBName = fieldDB . persistFieldDef instance PersistCore SqlBackend where newtype BackendKey SqlBackend = SqlBackendKey { unSqlBackendKey :: Int64 } deriving (Show, Read, Eq, Ord, Num, Integral, PersistField, PersistFieldSql, PathPiece, ToHttpApiData, FromHttpApiData, Real, Enum, Bounded, A.ToJSON, A.FromJSON) instance PersistCore SqlReadBackend where newtype BackendKey SqlReadBackend = SqlReadBackendKey { unSqlReadBackendKey :: Int64 } deriving (Show, Read, Eq, Ord, Num, Integral, PersistField, PersistFieldSql, PathPiece, ToHttpApiData, FromHttpApiData, Real, Enum, Bounded, A.ToJSON, A.FromJSON) instance PersistCore SqlWriteBackend where newtype BackendKey SqlWriteBackend = SqlWriteBackendKey { unSqlWriteBackendKey :: Int64 } deriving (Show, Read, Eq, Ord, Num, Integral, PersistField, PersistFieldSql, PathPiece, ToHttpApiData, FromHttpApiData, Real, Enum, Bounded, A.ToJSON, A.FromJSON) instance BackendCompatible SqlBackend SqlBackend where projectBackend = id instance BackendCompatible SqlBackend SqlReadBackend where projectBackend = unSqlReadBackend instance BackendCompatible SqlBackend SqlWriteBackend where projectBackend = unSqlWriteBackend instance PersistStoreWrite SqlBackend where update _ [] = return () update k upds = do conn <- ask let wher = whereStmtForKey conn k let sql = T.concat [ "UPDATE " , connEscapeName conn $ tableDBName $ recordTypeFromKey k , " SET " , T.intercalate "," $ map (mkUpdateText conn) upds , " WHERE " , wher ] rawExecute sql $ map updatePersistValue upds `mappend` keyToValues k insert val = do conn <- ask let esql = connInsertSql conn t vals key <- case esql of ISRSingle sql -> withRawQuery sql vals $ do x <- CL.head case x of Just [PersistInt64 i] -> case keyFromValues [PersistInt64 i] of Left err -> error $ "SQL insert: keyFromValues: PersistInt64 " `mappend` show i `mappend` " " `mappend` unpack err Right k -> return k Nothing -> error $ "SQL insert did not return a result giving the generated ID" Just vals' -> case 
keyFromValues vals' of Left e -> error $ "Invalid result from a SQL insert, got: " ++ show vals' ++ ". Error was: " ++ unpack e Right k -> return k ISRInsertGet sql1 sql2 -> do rawExecute sql1 vals withRawQuery sql2 [] $ do mm <- CL.head let m = maybe (Left $ "No results from ISRInsertGet: " `mappend` tshow (sql1, sql2)) Right mm -- TODO: figure out something better for MySQL let convert x = case x of [PersistByteString i] -> case readInteger i of -- mssql Just (ret,"") -> [PersistInt64 $ fromIntegral ret] _ -> x _ -> x -- Yes, it's just <|>. Older bases don't have the -- instance for Either. onLeft Left{} x = x onLeft x _ = x case m >>= (\x -> keyFromValues x `onLeft` keyFromValues (convert x)) of Right k -> return k Left err -> throw $ "ISRInsertGet: keyFromValues failed: " `mappend` err ISRManyKeys sql fs -> do rawExecute sql vals case entityPrimary t of Nothing -> error $ "ISRManyKeys is used when Primary is defined " ++ show sql Just pdef -> let pks = map fieldHaskell $ compositeFields pdef keyvals = map snd $ filter (\(a, _) -> let ret=isJust (find (== a) pks) in ret) $ zip (map fieldHaskell $ entityFields t) fs in case keyFromValues keyvals of Right k -> return k Left e -> error $ "ISRManyKeys: unexpected keyvals result: " `mappend` unpack e return key where tshow :: Show a => a -> Text tshow = T.pack . show throw = liftIO . throwIO . userError . T.unpack t = entityDef $ Just val vals = map toPersistValue $ toPersistFields val insertMany [] = return [] insertMany vals = do conn <- ask case connInsertManySql conn of Nothing -> mapM insert vals Just insertManyFn -> case insertManyFn ent valss of ISRSingle sql -> rawSql sql (concat valss) _ -> error "ISRSingle is expected from the connInsertManySql function" where ent = entityDef vals valss = map (map toPersistValue . toPersistFields) vals insertMany_ vals0 = runChunked (length $ entityFields t) insertMany_' vals0 where t = entityDef vals0 insertMany_' vals = do conn <- ask let valss = map (map toPersistValue . toPersistFields) vals let sql = T.concat [ "INSERT INTO " , connEscapeName conn (entityDB t) , "(" , T.intercalate "," $ map (connEscapeName conn . fieldDB) $ entityFields t , ") VALUES (" , T.intercalate "),(" $ replicate (length valss) $ T.intercalate "," $ map (const "?") (entityFields t) , ")" ] rawExecute sql (concat valss) replace k val = do conn <- ask let t = entityDef $ Just val let wher = whereStmtForKey conn k let sql = T.concat [ "UPDATE " , connEscapeName conn (entityDB t) , " SET " , T.intercalate "," (map (go conn . fieldDB) $ entityFields t) , " WHERE " , wher ] vals = map toPersistValue (toPersistFields val) `mappend` keyToValues k rawExecute sql vals where go conn x = connEscapeName conn x `T.append` "=?" 
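-- Illustrative sketch of what 'replace' above generates (the table and
-- column names here are hypothetical, and the exact quoting is whatever the
-- backend's 'connEscapeName' produces): for an entity with fields @name@ and
-- @age@ keyed by a single @id@ column, the statement has the shape
--
-- > UPDATE "person" SET "name"=?,"age"=? WHERE "id"=?
--
-- with the placeholders filled from 'toPersistFields' of the new value
-- followed by 'keyToValues' of the key, matching the @vals@ binding above.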
insertKey k v = insrepHelper "INSERT" [Entity k v] insertEntityMany es' = do conn <- ask let entDef = entityDef $ map entityVal es' let columnNames = keyAndEntityColumnNames entDef conn runChunked (length columnNames) go es' where go = insrepHelper "INSERT" repsert key value = do mExisting <- get key case mExisting of Nothing -> insertKey key value Just _ -> replace key value repsertMany [] = return () repsertMany krsDups = do conn <- ask let krs = nubBy ((==) `on` fst) (reverse krsDups) let rs = snd `fmap` krs let ent = entityDef rs let nr = length krs let toVals (k,r) = case entityPrimary ent of Nothing -> keyToValues k <> (toPersistValue <$> toPersistFields r) Just _ -> toPersistValue <$> toPersistFields r case connRepsertManySql conn of (Just mkSql) -> rawExecute (mkSql ent nr) (concatMap toVals krs) Nothing -> mapM_ (uncurry repsert) krs delete k = do conn <- ask rawExecute (sql conn) (keyToValues k) where wher conn = whereStmtForKey conn k sql conn = T.concat [ "DELETE FROM " , connEscapeName conn $ tableDBName $ recordTypeFromKey k , " WHERE " , wher conn ] instance PersistStoreWrite SqlWriteBackend where insert v = withReaderT persistBackend $ insert v insertMany vs = withReaderT persistBackend $ insertMany vs insertMany_ vs = withReaderT persistBackend $ insertMany_ vs insertEntityMany vs = withReaderT persistBackend $ insertEntityMany vs insertKey k v = withReaderT persistBackend $ insertKey k v repsert k v = withReaderT persistBackend $ repsert k v replace k v = withReaderT persistBackend $ replace k v delete k = withReaderT persistBackend $ delete k update k upds = withReaderT persistBackend $ update k upds repsertMany krs = withReaderT persistBackend $ repsertMany krs instance PersistStoreRead SqlBackend where get k = do mEs <- getMany [k] return $ Map.lookup k mEs -- inspired by Database.Persist.Sql.Orphan.PersistQuery.selectSourceRes getMany [] = return Map.empty getMany ks@(k:_)= do conn <- ask let t = entityDef . dummyFromKey $ k let cols = commaSeparated . entityColumnNames t let wher = whereStmtForKeys conn ks let sql = T.concat [ "SELECT " , cols conn , " FROM " , connEscapeName conn $ entityDB t , " WHERE " , wher ] let parse vals = case parseEntityValues t vals of Left s -> liftIO $ throwIO $ PersistMarshalError s Right row -> return row withRawQuery sql (Foldable.foldMap keyToValues ks) $ do es <- CL.mapM parse .| CL.consume return $ Map.fromList $ fmap (\e -> (entityKey e, entityVal e)) es instance PersistStoreRead SqlReadBackend where get k = withReaderT persistBackend $ get k getMany ks = withReaderT persistBackend $ getMany ks instance PersistStoreRead SqlWriteBackend where get k = withReaderT persistBackend $ get k getMany ks = withReaderT persistBackend $ getMany ks dummyFromKey :: Key record -> Maybe record dummyFromKey = Just . 
recordTypeFromKey recordTypeFromKey :: Key record -> record recordTypeFromKey _ = error "dummyFromKey" insrepHelper :: (MonadIO m, PersistEntity val) => Text -> [Entity val] -> ReaderT SqlBackend m () insrepHelper _ [] = return () insrepHelper command es = do conn <- ask let columnNames = keyAndEntityColumnNames entDef conn rawExecute (sql conn columnNames) vals where entDef = entityDef $ map entityVal es sql conn columnNames = T.concat [ command , " INTO " , connEscapeName conn (entityDB entDef) , "(" , T.intercalate "," columnNames , ") VALUES (" , T.intercalate "),(" $ replicate (length es) $ T.intercalate "," $ map (const "?") columnNames , ")" ] vals = Foldable.foldMap entityValues es runChunked :: (Monad m) => Int -> ([a] -> ReaderT SqlBackend m ()) -> [a] -> ReaderT SqlBackend m () runChunked _ _ [] = return () runChunked width m xs = do conn <- ask case connMaxParams conn of Nothing -> m xs Just maxParams -> let chunkSize = maxParams `div` width in mapM_ m (chunksOf chunkSize xs) -- Implement this here to avoid depending on the split package chunksOf :: Int -> [a] -> [[a]] chunksOf _ [] = [] chunksOf size xs = let (chunk, rest) = splitAt size xs in chunk : chunksOf size rest persistent-2.10.5.2/Database/Persist/Sql/Orphan/PersistUnique.hs0000644000000000000000000001167113464141060022631 0ustar0000000000000000{-# OPTIONS_GHC -fno-warn-orphans #-} module Database.Persist.Sql.Orphan.PersistUnique () where import Control.Exception (throwIO) import Control.Monad.IO.Class (liftIO, MonadIO) import Control.Monad.Trans.Reader (ask, withReaderT, ReaderT) import qualified Data.Conduit.List as CL import Data.Function (on) import Data.List (nubBy) import Data.Monoid (mappend) import qualified Data.Text as T import Database.Persist import Database.Persist.Class.PersistUnique (defaultPutMany, persistUniqueKeyValues, onlyOneUniqueDef) import Database.Persist.Sql.Types import Database.Persist.Sql.Raw import Database.Persist.Sql.Orphan.PersistStore (withRawQuery) import Database.Persist.Sql.Util (dbColumns, parseEntityValues, updatePersistValue, mkUpdateText') defaultUpsert :: ( MonadIO m , PersistEntity record , PersistUniqueWrite backend , PersistEntityBackend record ~ BaseBackend backend , OnlyOneUniqueKey record ) => record -> [Update record] -> ReaderT backend m (Entity record) defaultUpsert record updates = do uniqueKey <- onlyUnique record upsertBy uniqueKey record updates instance PersistUniqueWrite SqlBackend where upsert record updates = do conn <- ask let escape = connEscapeName conn let refCol n = T.concat [escape (entityDB t), ".", n] let mkUpdateText = mkUpdateText' escape refCol uniqueKey <- onlyUnique record case connUpsertSql conn of Just upsertSql -> case updates of [] -> defaultUpsert record updates _:_ -> do let upds = T.intercalate "," $ map mkUpdateText updates sql = upsertSql t (pure (onlyOneUniqueDef (Just record))) upds vals = map toPersistValue (toPersistFields record) ++ map updatePersistValue updates ++ unqs uniqueKey x <- rawSql sql vals return $ head x Nothing -> defaultUpsert record updates where t = entityDef $ Just record unqs uniqueKey = concatMap persistUniqueToValues [uniqueKey] deleteBy uniq = do conn <- ask let sql' = sql conn vals = persistUniqueToValues uniq rawExecute sql' vals where t = entityDef $ dummyFromUnique uniq go = map snd . persistUniqueToFieldNames go' conn x = connEscapeName conn x `mappend` "=?" 
sql conn = T.concat [ "DELETE FROM " , connEscapeName conn $ entityDB t , " WHERE " , T.intercalate " AND " $ map (go' conn) $ go uniq] putMany [] = return () putMany rsD = do let uKeys = persistUniqueKeys . head $ rsD case uKeys of [] -> insertMany_ rsD _ -> go where go = do let rs = nubBy ((==) `on` persistUniqueKeyValues) (reverse rsD) let ent = entityDef rs let nr = length rs let toVals r = map toPersistValue $ toPersistFields r conn <- ask case connPutManySql conn of (Just mkSql) -> rawExecute (mkSql ent nr) (concatMap toVals rs) Nothing -> defaultPutMany rs instance PersistUniqueWrite SqlWriteBackend where deleteBy uniq = withReaderT persistBackend $ deleteBy uniq upsert rs us = withReaderT persistBackend $ upsert rs us putMany rs = withReaderT persistBackend $ putMany rs instance PersistUniqueRead SqlBackend where getBy uniq = do conn <- ask let sql = T.concat [ "SELECT " , T.intercalate "," $ dbColumns conn t , " FROM " , connEscapeName conn $ entityDB t , " WHERE " , sqlClause conn] uvals = persistUniqueToValues uniq withRawQuery sql uvals $ do row <- CL.head case row of Nothing -> return Nothing Just [] -> error "getBy: empty row" Just vals -> case parseEntityValues t vals of Left err -> liftIO $ throwIO $ PersistMarshalError err Right r -> return $ Just r where sqlClause conn = T.intercalate " AND " $ map (go conn) $ toFieldNames' uniq go conn x = connEscapeName conn x `mappend` "=?" t = entityDef $ dummyFromUnique uniq toFieldNames' = map snd . persistUniqueToFieldNames instance PersistUniqueRead SqlReadBackend where getBy uniq = withReaderT persistBackend $ getBy uniq instance PersistUniqueRead SqlWriteBackend where getBy uniq = withReaderT persistBackend $ getBy uniq dummyFromUnique :: Unique v -> Maybe v dummyFromUnique _ = Nothing persistent-2.10.5.2/test/main.hs0000644000000000000000000005045213621633057014600 0ustar0000000000000000{-# language RecordWildCards #-} import Test.Hspec import qualified Data.Text as T import Data.List.NonEmpty (NonEmpty(..)) import qualified Data.List.NonEmpty as NEL import qualified Data.Map as Map import Database.Persist.Quasi import Database.Persist.Types main :: IO () main = hspec $ do describe "tokenization" $ do it "handles normal words" $ tokenize " foo bar baz" `shouldBe` [ Spaces 1 , Token "foo" , Spaces 3 , Token "bar" , Spaces 2 , Token "baz" ] it "handles quotes" $ tokenize " \"foo bar\" \"baz\"" `shouldBe` [ Spaces 2 , Token "foo bar" , Spaces 2 , Token "baz" ] it "handles quotes mid-token" $ tokenize " x=\"foo bar\" \"baz\"" `shouldBe` [ Spaces 2 , Token "x=foo bar" , Spaces 2 , Token "baz" ] it "handles escaped quote mid-token" $ tokenize " x=\\\"foo bar\" \"baz\"" `shouldBe` [ Spaces 2 , Token "x=\\\"foo" , Spaces 1 , Token "bar\"" , Spaces 2 , Token "baz" ] it "handles unnested parantheses" $ tokenize " (foo bar) (baz)" `shouldBe` [ Spaces 2 , Token "foo bar" , Spaces 2 , Token "baz" ] it "handles unnested parantheses mid-token" $ tokenize " x=(foo bar) (baz)" `shouldBe` [ Spaces 2 , Token "x=foo bar" , Spaces 2 , Token "baz" ] it "handles nested parantheses" $ tokenize " (foo (bar)) (baz)" `shouldBe` [ Spaces 2 , Token "foo (bar)" , Spaces 2 , Token "baz" ] it "escaping" $ tokenize " (foo \\(bar) y=\"baz\\\"\"" `shouldBe` [ Spaces 2 , Token "foo (bar" , Spaces 2 , Token "y=baz\"" ] it "mid-token quote in later token" $ tokenize "foo bar baz=(bin\")" `shouldBe` [ Token "foo" , Spaces 1 , Token "bar" , Spaces 1 , Token "baz=bin\"" ] describe "comments" $ do it "recognizes one line" $ do tokenize "-- | this is a comment" 
`shouldBe` [ DocComment "-- | this is a comment" ] it "map tokenize" $ do map tokenize ["Foo", "-- | Hello"] `shouldBe` [ [Token "Foo"] , [DocComment "-- | Hello"] ] it "works if comment is indented" $ do tokenize " -- | comment" `shouldBe` [ Spaces 2, DocComment "-- | comment" ] describe "parseFieldType" $ do it "simple types" $ parseFieldType "FooBar" `shouldBe` Right (FTTypeCon Nothing "FooBar") it "module types" $ parseFieldType "Data.Map.FooBar" `shouldBe` Right (FTTypeCon (Just "Data.Map") "FooBar") it "application" $ parseFieldType "Foo Bar" `shouldBe` Right ( FTTypeCon Nothing "Foo" `FTApp` FTTypeCon Nothing "Bar") it "application multiple" $ parseFieldType "Foo Bar Baz" `shouldBe` Right ( (FTTypeCon Nothing "Foo" `FTApp` FTTypeCon Nothing "Bar") `FTApp` FTTypeCon Nothing "Baz" ) it "parens" $ do let foo = FTTypeCon Nothing "Foo" bar = FTTypeCon Nothing "Bar" baz = FTTypeCon Nothing "Baz" parseFieldType "Foo (Bar Baz)" `shouldBe` Right ( foo `FTApp` (bar `FTApp` baz)) it "lists" $ do let foo = FTTypeCon Nothing "Foo" bar = FTTypeCon Nothing "Bar" bars = FTList bar baz = FTTypeCon Nothing "Baz" parseFieldType "Foo [Bar] Baz" `shouldBe` Right ( foo `FTApp` bars `FTApp` baz) describe "preparse" $ do it "recognizes entity" $ do preparse "Person\n name String\n age Int" `shouldBe` [ Line { lineIndent = 0, tokens = ["Person"] } , Line { lineIndent = 2, tokens = ["name", "String"] } , Line { lineIndent = 2, tokens = ["age", "Int"] } ] describe "recognizes comments" $ do let text = "Foo\n x X\n-- | Hello\nBar\n name String" linesText = T.lines text it "T.lines" $ do linesText `shouldBe` [ "Foo" , " x X" , "-- | Hello" , "Bar" , " name String" ] let tokens = map tokenize linesText it "map tokenize" $ do tokens `shouldBe` [ [ Token "Foo" ] , [ Spaces 2, Token "x", Spaces 1, Token "X"] , [ DocComment "-- | Hello" ] , [ Token "Bar" ] , [ Spaces 1, Token "name", Spaces 1, Token "String" ] ] let filtered = filter (not . empty) tokens it "filter (not . 
empty)" $ do filtered `shouldBe` [ [ Token "Foo" ] , [ Spaces 2, Token "x", Spaces 1, Token "X"] , [ DocComment "-- | Hello" ] , [ Token "Bar" ] , [ Spaces 1, Token "name", Spaces 1, Token "String" ] ] let spacesRemoved = removeSpaces filtered it "removeSpaces" $ do spacesRemoved `shouldBe` [ Line { lineIndent = 0, tokens = ["Foo"] } , Line { lineIndent = 2, tokens = ["x", "X"] } , Line { lineIndent = 0, tokens = ["-- | Hello"] } , Line { lineIndent = 0, tokens = ["Bar"] } , Line { lineIndent = 1, tokens = ["name", "String"] } ] it "preparse" $ do preparse text `shouldBe` [ Line { lineIndent = 0, tokens = ["Foo"] } , Line { lineIndent = 2, tokens = ["x", "X"] } , Line { lineIndent = 0, tokens = ["-- | Hello"] } , Line { lineIndent = 0, tokens = ["Bar"] } , Line { lineIndent = 1, tokens = ["name", "String"] } ] it "preparse indented" $ do let t = T.unlines [ " Foo" , " x X" , " -- | Comment" , " -- hidden comment" , " Bar" , " name String" ] preparse t `shouldBe` [ Line { lineIndent = 2, tokens = ["Foo"] } , Line { lineIndent = 4, tokens = ["x", "X"] } , Line { lineIndent = 2, tokens = ["-- | Comment"] } , Line { lineIndent = 2, tokens = ["Bar"] } , Line { lineIndent = 4, tokens = ["name", "String"] } ] it "preparse extra blocks" $ do let t = T.unlines [ "LowerCaseTable" , " name String" , " ExtraBlock" , " foo bar" , " baz" , " ExtraBlock2" , " something" ] preparse t `shouldBe` [ Line { lineIndent = 0, tokens = ["LowerCaseTable"] } , Line { lineIndent = 2, tokens = ["name", "String"] } , Line { lineIndent = 2, tokens = ["ExtraBlock"] } , Line { lineIndent = 4, tokens = ["foo", "bar"] } , Line { lineIndent = 4, tokens = ["baz"] } , Line { lineIndent = 2, tokens = ["ExtraBlock2"] } , Line { lineIndent = 4, tokens = ["something"] } ] it "field comments" $ do let text = T.unlines [ "-- | Model" , "Foo" , " -- | Field" , " name String" ] preparse text `shouldBe` [ Line { lineIndent = 0, tokens = ["-- | Model"] } , Line { lineIndent = 0, tokens = ["Foo"] } , Line { lineIndent = 2, tokens = ["-- | Field"] } , Line { lineIndent = 2, tokens = ["name", "String"] } ] describe "empty" $ do it "doesn't dispatch comments" $ do [DocComment "-- | hello"] `shouldSatisfy` (not . empty) it "removes spaces" $ do [Spaces 3] `shouldSatisfy` empty describe "filter (not . empty)" $ do let subject = filter (not . 
empty) it "keeps comments" $ do subject [[DocComment "-- | Hello"]] `shouldBe` [[DocComment "-- | Hello"]] it "omits lines with only spaces" $ do subject [[Spaces 3, Token "indented"], [Spaces 2]] `shouldBe` [[Spaces 3, Token "indented"]] describe "removeSpaces" $ do it "sets indentation level for a line" $ do removeSpaces [[Spaces 3, Token "hello", Spaces 1, Token "goodbye"]] `shouldBe` [ Line { lineIndent = 3, tokens = ["hello", "goodbye"] } ] it "does not remove comments" $ do removeSpaces [ [ DocComment "-- | asdf" ] , [ Token "Foo" ] , [ Spaces 2, Token "name", Spaces 1, Token "String" ] ] `shouldBe` [ Line { lineIndent = 0, tokens = ["-- | asdf"] } , Line { lineIndent = 0, tokens = ["Foo"] } , Line { lineIndent = 2, tokens = ["name", "String"] } ] describe "associateLines" $ do let foo = Line { lineIndent = 0, tokens = pure "Foo" } name'String = Line { lineIndent = 2, tokens = "name" :| ["String"] } comment = Line { lineIndent = 0, tokens = pure "-- | comment" } it "works" $ do associateLines [ comment , foo , name'String ] `shouldBe` [ LinesWithComments { lwcComments = ["comment"] , lwcLines = foo :| [name'String] } ] let bar = Line { lineIndent = 0, tokens = "Bar" :| ["sql", "=", "bars"] } age'Int = Line { lineIndent = 1, tokens = "age" :| ["Int"] } it "works when used consecutively" $ do associateLines [ bar , age'Int , comment , foo , name'String ] `shouldBe` [ LinesWithComments { lwcComments = [] , lwcLines = bar :| [age'Int] } , LinesWithComments { lwcComments = ["comment"] , lwcLines = foo :| [name'String] } ] it "works with textual input" $ do let text = "Foo\n x X\n-- | Hello\nBar\n name String" parsed = preparse text allFull = skipEmpty parsed associateLines allFull `shouldBe` [ LinesWithComments { lwcLines = Line {lineIndent = 0, tokens = "Foo" :| []} :| [ Line {lineIndent = 2, tokens = "x" :| ["X"]} ] , lwcComments = [] } , LinesWithComments { lwcLines = Line {lineIndent = 0, tokens = "Bar" :| []} :| [ Line {lineIndent = 1, tokens = "name" :| ["String"]}] , lwcComments = ["Hello"] } ] it "works with extra blocks" $ do let text = skipEmpty . preparse . T.unlines $ [ "LowerCaseTable" , " Id sql=my_id" , " fullName Text" , " ExtraBlock" , " foo bar" , " baz" , " bin" , " ExtraBlock2" , " something" ] associateLines text `shouldBe` [ LinesWithComments { lwcLines = Line { lineIndent = 0, tokens = pure "LowerCaseTable" } :| [ Line { lineIndent = 4, tokens = "Id" :| ["sql=my_id"] } , Line { lineIndent = 4, tokens = "fullName" :| ["Text"] } , Line { lineIndent = 4, tokens = pure "ExtraBlock" } , Line { lineIndent = 8, tokens = "foo" :| ["bar"] } , Line { lineIndent = 8, tokens = pure "baz" } , Line { lineIndent = 8, tokens = pure "bin" } , Line { lineIndent = 4, tokens = pure "ExtraBlock2" } , Line { lineIndent = 8, tokens = pure "something" } ] , lwcComments = [] } ] it "works with extra blocks twice" $ do let text = skipEmpty . preparse . 
T.unlines $ [ "IdTable" , " Id Day default=CURRENT_DATE" , " name Text" , "" , "LowerCaseTable" , " Id sql=my_id" , " fullName Text" , " ExtraBlock" , " foo bar" , " baz" , " bin" , " ExtraBlock2" , " something" ] associateLines text `shouldBe` [ LinesWithComments { lwcLines = Line 0 (pure "IdTable") :| [ Line 4 ("Id" :| ["Day", "default=CURRENT_DATE"]) , Line 4 ("name" :| ["Text"]) ] , lwcComments = [] } , LinesWithComments { lwcLines = Line { lineIndent = 0, tokens = pure "LowerCaseTable" } :| [ Line { lineIndent = 4, tokens = "Id" :| ["sql=my_id"] } , Line { lineIndent = 4, tokens = "fullName" :| ["Text"] } , Line { lineIndent = 4, tokens = pure "ExtraBlock" } , Line { lineIndent = 8, tokens = "foo" :| ["bar"] } , Line { lineIndent = 8, tokens = pure "baz" } , Line { lineIndent = 8, tokens = pure "bin" } , Line { lineIndent = 4, tokens = pure "ExtraBlock2" } , Line { lineIndent = 8, tokens = pure "something" } ] , lwcComments = [] } ] it "works with field comments" $ do let text = skipEmpty . preparse . T.unlines $ [ "-- | Model" , "Foo" , " -- | Field" , " name String" ] associateLines text `shouldBe` [ LinesWithComments { lwcLines = Line { lineIndent = 0, tokens = "Foo" :| [] } :| [ Line { lineIndent = 2, tokens = pure "-- | Field" } , Line { lineIndent = 2, tokens = "name" :| ["String"] } ] , lwcComments = ["Model"] } ] describe "parseLines" $ do let lines = T.unlines [ "-- | Comment" , "Foo" , " -- | Field" , " name String" , " age Int" , " Extra" , " foo bar" , " baz" , " Extra2" , " something" ] let [subject] = parse lowerCaseSettings lines it "produces the right name" $ do entityHaskell subject `shouldBe` HaskellName "Foo" describe "entityFields" $ do let fields = entityFields subject it "has the right field names" $ do map fieldHaskell fields `shouldMatchList` [ HaskellName "name" , HaskellName "age" ] it "has comments" $ do map fieldComments fields `shouldBe` [ Just "Field\n" , Nothing ] it "has the comments" $ do entityComments subject `shouldBe` Just "Comment\n" it "combines extrablocks" $ do entityExtra subject `shouldBe` Map.fromList [ ("Extra", [["foo", "bar"], ["baz"]]) , ("Extra2", [["something"]]) ] describe "works with extra blocks" $ do let [_, lowerCaseTable, idTable] = parse lowerCaseSettings $ T.unlines [ "" , "IdTable" , " Id Day default=CURRENT_DATE" , " name Text" , "" , "LowerCaseTable" , " Id sql=my_id" , " fullName Text" , " ExtraBlock" , " foo bar" , " baz" , " bin" , " ExtraBlock2" , " something" , "" , "IdTable" , " Id Day default=CURRENT_DATE" , " name Text" , "" ] describe "idTable" $ do let EntityDef {..} = idTable it "has no extra blocks" $ do entityExtra `shouldBe` mempty it "has the right name" $ do entityHaskell `shouldBe` HaskellName "IdTable" it "has the right fields" $ do map fieldHaskell entityFields `shouldMatchList` [ HaskellName "name" ] describe "lowerCaseTable" $ do let EntityDef {..} = lowerCaseTable it "has the right name" $ do entityHaskell `shouldBe` HaskellName "LowerCaseTable" it "has the right fields" $ do map fieldHaskell entityFields `shouldMatchList` [ HaskellName "fullName" ] it "has ExtraBlock" $ do Map.lookup "ExtraBlock" entityExtra `shouldBe` Just [ ["foo", "bar"] , ["baz"] , ["bin"] ] it "has ExtraBlock2" $ do Map.lookup "ExtraBlock2" entityExtra `shouldBe` Just [ ["something"] ] persistent-2.10.5.2/LICENSE0000644000000000000000000000207513464141060013335 0ustar0000000000000000Copyright (c) 2012 Michael Snoyman, http://www.yesodweb.com/ Permission is hereby granted, free of charge, to any person obtaining a copy of this 
software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. persistent-2.10.5.2/Setup.lhs0000755000000000000000000000016213464141060014136 0ustar0000000000000000#!/usr/bin/env runhaskell > module Main where > import Distribution.Simple > main :: IO () > main = defaultMain persistent-2.10.5.2/persistent.cabal0000644000000000000000000001120413621633057015515 0ustar0000000000000000name: persistent version: 2.10.5.2 license: MIT license-file: LICENSE author: Michael Snoyman maintainer: Michael Snoyman , Greg Weber synopsis: Type-safe, multi-backend data serialization. description: Hackage documentation generation is not reliable. For up to date documentation, please see: . category: Database, Yesod stability: Stable cabal-version: >= 1.10 build-type: Simple homepage: http://www.yesodweb.com/book/persistent bug-reports: https://github.com/yesodweb/persistent/issues extra-source-files: ChangeLog.md README.md flag nooverlap default: False description: test out our assumption that OverlappingInstances is just for String library if flag(nooverlap) cpp-options: -DNO_OVERLAP build-depends: base >= 4.9 && < 5 , aeson >= 1.0 , attoparsec , base64-bytestring , blaze-html >= 0.9 , bytestring >= 0.10 , conduit >= 1.2.12 , containers >= 0.5 , fast-logger >= 2.4 , http-api-data >= 0.3 , monad-logger >= 0.3.28 , mtl , path-pieces >= 0.2 , resource-pool >= 0.2.3 , resourcet >= 1.1.10 , scientific , silently , template-haskell , text >= 1.2 , time >= 1.6 , transformers >= 0.5 , unliftio-core , unliftio , unordered-containers , vector default-extensions: FlexibleContexts , MultiParamTypeClasses , OverloadedStrings , TypeFamilies exposed-modules: Database.Persist Database.Persist.Quasi Database.Persist.Types Database.Persist.Class Database.Persist.Sql Database.Persist.Sql.Util Database.Persist.Sql.Types.Internal other-modules: Database.Persist.Types.Base Database.Persist.Class.DeleteCascade Database.Persist.Class.PersistEntity Database.Persist.Class.PersistQuery Database.Persist.Class.PersistUnique Database.Persist.Class.PersistConfig Database.Persist.Class.PersistField Database.Persist.Class.PersistStore Database.Persist.Sql.Migration Database.Persist.Sql.Internal Database.Persist.Sql.Types Database.Persist.Sql.Raw Database.Persist.Sql.Run Database.Persist.Sql.Class Database.Persist.Sql.Orphan.PersistQuery Database.Persist.Sql.Orphan.PersistStore Database.Persist.Sql.Orphan.PersistUnique ghc-options: -Wall default-language: Haskell2010 test-suite test type: exitcode-stdio-1.0 main-is: test/main.hs build-depends: base >= 4.9 && < 5 , aeson , attoparsec , base64-bytestring , blaze-html , bytestring , containers , hspec >= 2.4 , http-api-data , 
path-pieces , scientific , text , time , transformers , unordered-containers , vector cpp-options: -DTEST default-extensions: FlexibleContexts , MultiParamTypeClasses , OverloadedStrings , TypeFamilies other-modules: Database.Persist.Class.PersistEntity Database.Persist.Class.PersistField Database.Persist.Quasi Database.Persist.Types Database.Persist.Types.Base default-language: Haskell2010 source-repository head type: git location: git://github.com/yesodweb/persistent.git persistent-2.10.5.2/ChangeLog.md0000644000000000000000000002041313621633057014504 0ustar0000000000000000# Changelog for persistent ## 2.10.5.2 * [#1041](https://github.com/yesodweb/persistent/pull/1041) * Explicit foreign keys can now reference tables with custom sql name * Add qualified names to the stock classes list. ## 2.10.5.1 * [#1024](https://github.com/yesodweb/persistent/pull/1024) * Add the ability to do documentation comments in entity definition syntax. Unfortunately, TemplateHaskell cannot add documentation comments, so this can't be used to add Haddocks to entities. * Add Haddock explainers for some of the supported entity syntax in `Database.Persist.Quasi` ## 2.10.5 * Add the `EntityWithPrefix` type to allow users to specify a custom prefix for raw SQL queries. [#1018](https://github.com/yesodweb/persistent/pull/1018) * Added Acquire based API to `Database.Persist.Sql` for working with connections/pools in monads which aren't MonadUnliftIO. [#984](https://github.com/yesodweb/persistent/pull/984) ## 2.10.4 * Log exceptions when closing a connection fails. See point 1 in [yesod #1635](https://github.com/yesodweb/yesod/issues/1635#issuecomment-547300856). [#978](https://github.com/yesodweb/persistent/pull/978) ## 2.10.3 * Added support for GHC 8.8 about MonadFail changes [#976](https://github.com/yesodweb/persistent/pull/976) ## 2.10.2 * Added `runMigrationQuiet` and `runMigrationUnsafeQuiet` to `Database.Persist.Sql.Migration` as safer alternatives to `runMigrationSilent`. [#971](https://github.com/yesodweb/persistent/pull/971) This functions as workaround/fix for: [#966](https://github.com/yesodweb/persistent/issues/966), [#948](https://github.com/yesodweb/persistent/issues/948), [#640](https://github.com/yesodweb/persistent/issues/640), and [#474](https://github.com/yesodweb/persistent/issues/474) * Added RawSql instances for 9, 10, 11 and 12-column results. [#961](https://github.com/yesodweb/persistent/pull/961) ## 2.10.1 * Added `constraint=` attribute to allow users to specify foreign reference constraint names. ## 2.10.0 * Added two type classes `OnlyOneUniqueKey` and `AtLeastOneUniqueKey`. These classes are used as constraints on functions that expect a certain amount of unique keys. They are defined automatically as part of the `persistent-template`'s generation. [#885](https://github.com/yesodweb/persistent/pull/885) * Add the `entityComments` field to the `EntityDef` datatype, and `fieldComments` fields to the `FieldDef` datatype. The QuasiQuoter does not currently know how to add documentation comments to these types, but it can be expanded later. [#865](https://github.com/yesodweb/persistent/pull/865) * Expose the `SqlReadT` and `SqlWriteT` constructors. [#887](https://github.com/yesodweb/persistent/pull/887) * Remove deprecated `Connection` type synonym. Please use `SqlBackend` instead. [#894](https://github.com/yesodweb/persistent/pull/894) * Remove deprecated `SqlPersist` type synonym. Please use `SqlPersistT` instead. 
[#894](https://github.com/yesodweb/persistent/pull/894) * Alter the type of `connUpsertSql` to take a list of unique definitions. This paves the way for more efficient upsert implementations. [#895](https://github.com/yesodweb/persistent/pull/895) ## 2.9.2 * Add documentation for the `Migration` type and some helpers. [#860](https://github.com/yesodweb/persistent/pull/860) ## 2.9.1 * Fix [#847](https://github.com/yesodweb/persistent/issues/847): SQL error with `putMany` on Sqlite when Entity has no unique index. ## 2.9.0 * Added support for SQL isolation levels via SqlBackend. [#812] * Move `Database.Persist.Sql.Raw.QQ` to a separate `persistent-qq` package [#827](https://github.com/yesodweb/persistent/issues/827) * Fix [#832](https://github.com/yesodweb/persistent/issues/832): `repsertMany` now matches `mapM_ (uncurry repsert)` and is atomic for supported sql back-ends. ## 2.8.2 * Added support for `sql=` to the unique constraints quasi-quoter so that users can specify the database names of the constraints. ## 2.8.1 * DRY-ed up and exposed several util functions in `Database.Persist.Sql.Util`. * Upstream-ed `updatePersistValue`, `mkUpdateText`, and `commaSeparated` from `Database.Persist.MySQL`. * De-duplicated `updatePersistValue` from various `Database.Persist.Sql.Orphan.*` modules. * Batching enhancements to reduce db round-trips. * Added `getMany` and `repsertMany` for batched `get` and `repsert`. * Added `putMany` with a default/slow implementation. SqlBackends that support native UPSERT should override this for batching enhancements. * Updated `insertEntityMany` to replace slow looped usage with batched execution. * See [#770](https://github.com/yesodweb/persistent/pull/770) ## 2.8.0 * Switch from `MonadBaseControl` to `MonadUnliftIO` * Reapplies [#723](https://github.com/yesodweb/persistent/pull/723), which was reverted in version 2.7.3. ## 2.7.3.1 * Improve error messages when failing to parse database results into Persistent records. [#741](https://github.com/yesodweb/persistent/pull/741) * A handful of `fromPersistField` implementations called `error` instead of returning a `Left Text`. All of the implementations were changed to return `Left`. [#741](https://github.com/yesodweb/persistent/pull/741) * Improve error message when a SQL insert fails with a custom primary key [#757](https://github.com/yesodweb/persistent/pull/757) ## 2.7.3 * Reverts [#723](https://github.com/yesodweb/persistent/pull/723), which generalized functions using the `BackendCompatible` class. These changes were an accidental breaking change. * Recommend the `PersistDbSpecific` docs if someone gets an error about converting from `PersistDbSpecific` ## 2.7.2 [DEPRECATED ON HACKAGE] * Many of the functions have been generalized using the `BackendCompatible` class. [#723](https://github.com/yesodweb/persistent/pull/723) * This change was an accidental breaking change and was reverted in 2.7.3. * These changes will be released in a future version of Persistent with a major version bump. 
* Add raw sql quasi quoters [#717](https://github.com/yesodweb/persistent/pull/717) ## 2.7.1 * Added an `insertUniqueEntity` function [#718](https://github.com/yesodweb/persistent/pull/718) * Added `BackendCompatible` class [#701](https://github.com/yesodweb/persistent/pull/701) ## 2.7.0 * Fix upsert behavior [#613](https://github.com/yesodweb/persistent/issues/613) * Atomic upsert query fixed for arithmetic operations [#662](https://github.com/yesodweb/persistent/issues/662) * Haddock and test coverage improved for upsert ## 2.6.1 * Fix edge case for `\<-. [Nothing]` * Introduce `connMaxParams` * Add 'getJustEntity' and 'insertRecord' convenience functions * Minor Haddock improvement ## 2.6 * Add `connUpsertSql` type for providing backend-specific upsert sql support. ## 2.5 * read/write typeclass split * add insertOrGet convenience function to PersistUnique ## 2.2.4.1 * Documentation updates [#515](https://github.com/yesodweb/persistent/pull/515) ## 2.2.4 * Workaround for side-exiting transformers in `runSqlConn` [#516](https://github.com/yesodweb/persistent/issues/516) ## 2.2.3 * PersistField instance for Natural * better oracle support in odbc ## 2.2.2 * Add liftSqlPersistMPool function * support http-api-data for url serialization ## 2.2.1 * Migration failure message with context * Fix insertKey for composite keys ## 2.2 * Add a `RawSql` instance for `Key`. This allows selecting primary keys using functions like `rawSql`. [#407](https://github.com/yesodweb/persistent/pull/407) * SqlBackend support for an optimized `insertMany` ## 2.1.6 Important! If persistent-template is not upgraded to 2.1.3.3 you might need to make sure `Int64` is in scope for your model declarations. * add showMigration function * explicitly use Int64 for foreign key references ## 2.1.5 Add `dbIdColumnsEsc` to Sql.Utils. Used in persistent-postgresql 2.1.5.2 ## 2.1.4 * Fix getBy with a primary key. #342 ## 2.1.3 * Break self-referencing cycles in the entity declarations ## 2.1.2 * Error with `Double`s without a decimal part [#378](https://github.com/yesodweb/persistent/issues/378) * `runSqlPool` does not perform timeout checks. ## 2.1.1.6 * One extra feature for #939: use `logDebugN` instead ## 2.1.1.5 * Better SQL logging [Yesod issue #939](https://github.com/yesodweb/yesod/issues/939) ## 2.1.1.3 Parse UTCTime in ISO 8601 format [#339](https://github.com/yesodweb/persistent/issues/339) ## 2.1.1.1 Support for monad-control 1.0 persistent-2.10.5.2/README.md0000644000000000000000000000032013464141060013602 0ustar0000000000000000## persistent Type-safe, multi-backend data serialization. You must use a specific backend in order to make this useful. For more information, see [the chapter in the Yesod book](http://www.yesodweb.com/book/persistent).
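As a rough quick-start sketch (assuming the `persistent-sqlite` backend and the Template Haskell helpers from `persistent-template` are installed; the `Person` entity below is only an example, and the exact set of language extensions may vary between versions):

```haskell
{-# LANGUAGE EmptyDataDecls, FlexibleContexts, GADTs, GeneralizedNewtypeDeriving #-}
{-# LANGUAGE MultiParamTypeClasses, OverloadedStrings, QuasiQuotes #-}
{-# LANGUAGE TemplateHaskell, TypeFamilies #-}
import Control.Monad.IO.Class (liftIO)
import Database.Persist
import Database.Persist.Sqlite (runSqlite, runMigration)
import Database.Persist.TH

-- A hypothetical schema, declared with the quasi-quoter.
share [mkPersist sqlSettings, mkMigrate "migrateAll"] [persistLowerCase|
Person
    name String
    age  Int
    deriving Show
|]

main :: IO ()
main = runSqlite ":memory:" $ do
    -- Create the table, insert a row, and read it back by its key.
    runMigration migrateAll
    personId <- insert (Person "Alice" 30)
    person   <- get personId
    liftIO (print person)
```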