shake-0.19.8/CHANGES.txt

Changelog for Shake (* = breaking change)

0.19.8, released 2024-01-14
    Test with GHC 9.8
    #844, optimise database reading/writing using unsafeUseAsCString
    #836, add threaded flag to disable using threaded runtime
    #837, require filepath-1.14

0.19.7, released 2022-09-15
    #815, improve corrupt database handling code
    Don't say (changed) for files that don't build

0.19.6, released 2021-09-07
    #810, don't hash files in &%> if you don't have to

0.19.5, released 2021-07-04
    #807, fix space leak in Database module
    #796, fix a bug with newCache dependencies
    #798, optimise dependency checking

0.19.4, released 2021-01-14
    #790, add option shakeAllowRedefineRules
*   #790, remove overrideBuiltinRule

0.19.3, released 2021-01-14
    #789, add overrideBuiltinRule
    #787, more documentation on doesFileExist

0.19.2, released 2020-11-15
    #780, AutoDeps should consider a rename as a write to the destination
    #778, AutoDeps shouldn't trigger for files read and written
    #779, merge dependencies in O(n and a bit)
    #779, merge local traces in O(n)
    #768, make the embed-files flag work on the executable too

0.19.1, released 2020-06-06
    #757, make sure shared cache writes are atomic
    Remove a small space leak if using the Database module
    #755, add instance IsCmdArgument (NonEmpty String)
    #754, tighten bounds to exclude GHC 7.10

0.19, released 2020-05-23
    #738, improve the help text
    #679, allow Ninja to depend on a directory
*   #748, close stdin by default in cmd
    #748, add NoProcessGroup and InheritStdin to cmd
    Delete stuff deprecated since 2014 - **>, ?>>, *>>, *>, |*>, &*>
    Mark askOracleWith and deprioritize as DEPRECATED
    #753, fix timing information for batch
    #746, optimise file modification times on Linux
    #746, add an instance for CmdArgument ()
    Allow getting FSATrace ByteString as a result
*   Make the FSATrace type polymorphic over the FilePath
    Speed up FSATrace parsing
    Require extra-1.6.19

0.18.5, released 2020-02-02
    Use uninterruptibleMask_ to ensure all cleanup happens robustly
    #742, make the Chrome profile only include the last run
    #740, make CmdArgument an instance of IsCmdArgument

0.18.4, released 2019-12-15
    #734, add Forward.cacheActionWith
    #734, make forward fail if shakeLintInside is empty
*   #701, make shakeSymlink=False the default
    #694, embed data files in the binary (use embed-files Cabal flag)
    #722, don't close file handles in cmd unless CloseFileHandles
    #727, add actionBracket for bracket-like operations
    #720, add --share-sanity to sanity check shared cache contents
    #719, write history key after cached files to ensure it is coherent
    #703, print rebuild warnings in yellow and errors in red
    #706, rename Verbosity's constructors and add Warn
    #443, add getEnvError
    #662, add Partial to command/cmd
    #708, rename deprioritize to reschedule
    #689, allow adding help messages at the end using addHelpSuffix
    #686, rename Build.hs to Shakefile.hs
    #690, make Ninja mode use -j0 by default (more like Ninja)

0.18.3, released 2019-07-01
    Add shakeSymlink to enable/disable symlinks in the shared cache
    Improve cmd on async exceptions with nested processes on Windows

0.18.2, released 2019-05-19
    #678, fix serious bug in writeFileChanged

0.18.1, released 2019-05-19
    #678, fix serious bug in writeFile', deleting the wrong file

0.18, released 2019-05-14
*   Make files copied to the shared cache read-only
    Delete files before writing, giving symlink/readonly safety
    #677, make --help work even if the build system throws errors
*   Use parallelism for Applicative adjacent needs
    Support GHC 8.8

0.17.9, released 2019-05-01
    #675, allow --compact=yes|no|auto
    Make lintTrackAllow work after the lintTrackRead/Write
*   Merge successive Cwd declarations to cmd
    #671, change the file format to permit rsync of caches
    Speed up lint tracking
    #670, make cloud values more portable between machines
    #666, exceptions in --share-list, --share-remove are warnings

0.17.8, released 2019-04-02
    Eliminate a corner case of losing exception messages
    Make FSATrace available as a result type
    Commands which have AutoDeps aren't also linted
    #278, add StdoutTrim - like Stdout, but with trim applied
    #659, require base >= 4.8

0.17.7, released 2019-03-18
    Add back -B as an alias for --rebuild
    Make --help say which other options have changed
    Add --lint-ignore flag to set shakeLintIgnore
    Improve the documentation for produces, call trackWrite
    Make trackWrite/trackRead respect shakeLintIgnore
    Make --lint-fsatrace also add to shakeLintInside
    #656, make --lint-fsatrace work with relative directories
    Make sure --no-build does not destroy previous profile output
    #437, #414, #298 rewrite the profile view
    #588, #521 add end times and traced top-levels to the profile
    Add askOracles
    Generate nested dependency information in profiles
    Don't add .exe at the end of traced cmd calls
    Change the profiling .json format
    #651, remove support for GHC 7.4, 7.6 and 7.8

0.17.6, released 2019-02-17
    Make --progress on Windows work with non-ASCII characters
    #18, add --compact for a Bazel/Buck style output
    Make the progress message display a more traditional count
    #638, add shakeTrace to record start/stop of traced commands
    Show user-added options separately
    #625, make --help show the list of targets
    #625, add addTarget/getTargets to track the available targets
    #416, generalise Forward.cacheAction to be in label and output
    #642, detect files that change with --lint-watch
    #649, add --share-list and --share-remove to work with the cache
    #626, improve the display of error messages with call stacks
    Make sure exceptions don't change based on verbosity setting
    Make sure all thread pool threads terminate before shake returns
    #646, change the identity property to require a Maybe
    Use symlinks for --shared
    #643, add MonadFail Rules

0.17.5, released 2019-02-05
    Remove dependency on heaps

0.17.4, released 2019-01-10
    Add shakeProfileDatabase to generate profile information
    Don't suggest -qb (irrelevant with -qg)

0.17.3, released 2018-12-04
    #632, remove O(n^2) behaviour when constructing user rules
    In diagnostic mode, show the number of actions/rules/user rules

0.17.2, released 2018-11-29
    Make needHasChanged work even between runs

0.17.1, released 2018-11-14
    Add actionCatch, to catch exceptions in Action
    #622, avoid potential maxBound overflow in batch
    Reduce the context required for addBuiltinRule

0.17, released 2018-10-15
    Add Database module for repeated execution from one database
*   Don't export the body of the Typeable type class
    Mark deprecated operators as DEPRECATED
    Improve async exception safety in some corner cases
    Support shakeFiles=/dev/null to avoid any database
    Add reprioritise to change the priority of a running rule
    Add a dependency on heaps package
    #611, fix a race condition in actionFinally that reran handlers
    Remove a potential space leak in the Rules monad
    #412, clarify that removeFilesAfter should be used sparingly
    #409, improve the documentation of newCache
    Generalise operations like versioned to Rules a -> Rules a
    Make the Shake database version string contain os and arch
    Don't delete the whole database when an oracle disappears
    #300, soften the database version change message
    IMPORTANT: Incompatible on disk format change
    #590, print "Build completed" timing with better accuracy
    #581, improve the information and display of call stacks
    Add actionRetry to retry actions if they fail
    #143, give locations when several rules match
    Compatibility with unix-2.8
    Add shakeShare, --share and history functions for shared builds
    #574, extensive documentation of Development.Shake.Rule
*   Change how alternative and priority rules interact
    Add versioned for rules
*   Change all the ways of obtaining user rules, see getUserRuleOne
    Improve time recording for unsafeExtraThread, parallel, batch
    #553, improve file-hash performance when nothing changes
*   Require addBuiltinRule to take a BuiltinIdentity argument
*   Make the third argument to BuiltinRun a real ADT
    Disable all the lintTrack functions unless lint is enabled
*   Change the Development.Shake.Rule.track* functions to lintTrack*
    #584, add Semigroup/Monoid instances for Action
    Add produces to declare which untracked files a rule produces
    Add historyDisable to note which rules cannot be cached
    #583, add replaceDirectory1
    Add addOracleHash for oracles with reduced storage requirements

0.16.4, released 2018-04-04
    #185, add addOracleCache which doesn't always rerun
    #576, remove incorrect Cabal description
    #575, print --help message at Quiet verbosity or above

0.16.3, released 2018-02-23
    Allow reading data files from next to the executable
    #566, require extra-1.6.1
    #565, deal with multi-file rules that don't create their contents
    Improve the documentation of getDirectoryContents

0.16.2, released 2018-02-13
    Set the correct lower bound on unordered-containers of 0.2.7

0.16.1, released 2018-02-11
    Support process-1.6.3.0
    #290, add withTempFileInside and withTempDirInside
    #549, Semigroup-Monoid proposal support
    Hadrian-462, fail gracefully if on-disk formats change
    #469, only print timing information at normal verbosity or above
    #340, add runAfter
    #372, deprecate askOracleWith
    #520, only record each dependency once
    #548, rebuild the Ninja files, if they change, restart
    #547, make Ninja --lint checking see through phonys
    #546, support Ninja line continuations in more places
    #470, make --verbose and --quiet help text clearer
    #534, make writeFile' and writeFileChanged create the directory
    #543, add resultHasChanged to track results which have changed
    Add getShakeExtraRules, like getShakeOptionsRules for shakeExtra
    #535, add shakeArgsOptionsWith to manipulate ShakeOptions
    #538, make sure processes can be killed properly
    #502, add needHasChanged to track what in a need has changed
    Weaken the typeclasses required for apply/apply1
    #502, add batch function to batch commands with high startup
    Avoid parallel duplicating local state such as discounting
    Prefer to resume tasks rather than start new ones

0.16, released 2017-09-20
    #536, make --skip work for oracles
    Ensure shakeOutput is used more consistently
    #49, add shakeColor and --color flags
    #490, recommend -threaded as standard
    #517, ignore ./ in FilePattern
    Require extra-1.5.3
    #499, add a filePattern function, like ?== but with the matches
    #474, never spawn user actions unmasked
    Allow user arguments to replace builtin arguments
    #522, make copyFile create directories if necessary
    #516, add an example for withTempDir
    #514, expose more about cmd arguments
    #523, #524, make sure phony doesn't run its dependencies first
    #515, add cmd_ function
    #506, allow duplicate type names in different modules
    #503, require shakeExtra to obey the sensible invariants
    #503, add getShakeExtra/addShakeExtra
    #492, fix the single letter flag documentation
    Expose 'Process' from Development.Shake
    #495, remove dangling link from LICENSE
    #436, remove Assume, switch to Rebuild
    Remove --always-make, use --rebuild instead
    #419, remove --assume-old and --assume-new, which never worked
    Remove support for running Makefile scripts
    Add getShakeOptionsRules, to get ShakeOptions in Rules
    #479, improve the robustness of the Pool tests
    #481, document how to raise errors in Action
    Delete the deprecated system* functions
    #427, check stored value after checking dependencies
    Significant changes to defining custom rules
    Delete the deprecated defaultRule
    IMPORTANT: Incompatible on disk format change
    #428, don't persist errors to the database

0.15.11, released 2017-01-18
    #488, make sure parallel tracks dependencies
    #513, permit process-1.4.3.0 and above

0.15.10, released 2016-06-14
    #465, fix phony names which clash with directories

0.15.9, released 2016-06-12
    Documentation tweaks
    Optimise the thread pool
    Fix progress output through wget

0.15.8, released 2016-06-01
    Extra base bounds to rule out building on GHC 7.4

0.15.7, released 2016-06-01
    Require extra-1.4.8
    #457, better error message when calling need on a directory
    #456, use displayException for inner exceptions
    Improve the documentation for priority/alternative
    #448, fix FSATrace on Mac 10.11
    #444, optimise copyFileChanged
    #58, change writeFile functions to MonadIO

0.15.6, released 2016-04-19
    Require the primitive package
    #431, make it an error to need a directory
    #393, add orderOnlyAction, a general version of orderOnly
    #408, explain that alwaysRerun has no immediate effect
    #410, add --digest-not flag to select ChangeModtime
    #416, export ShakeValue from the main module only
    #352, improve the grammar in the profiling document
    #399, explain what in getDirectoryFiles is tracked
    #108, add getDirectoryFilesIO, an untracked version
    #401, document getDirectoryFiles exception behaviour
    #401, fix getDirectoryFiles error on pattern directory missing
    #400, detect indirect recursion in build rules
    #402, give better errors on recursive rules
    #369, clarify the docs for putQuiet/putLoud
    #405, make sure phony targets with \ work on Windows
    Require extra-1.4.3
    #398, only require old-time with GHC 7.4
    #239, make Rules an instance of MonadIO
    #384, add shakeExtra field to ShakeOptions for user data
    #374, make --debug=file write out newlines
    #308, make sure Shell and AutoDeps work together
    #365, fix unicode output with putNormal
    #269, add parallel execution via parallel/forP/par
    #310, preserve quoting around cmd
    #333, optimise the random pool, up to 20% faster
    #335, add deriving Read instances for more types
    Add Development.Shake.Forward for forward-defined systems
    #336, export :->
    #337, fix the docs for &%>
    #297, make run the query profile viewer
    #305, make profiling work with newer Chrome
    #327, add AutoDeps feature
    #325, add FileStdin to specify the stdin comes from a file
    Make an empty Stdin different from no Stdin (inherit) in cmd
    #320, add ** as a relative-only alternative to // in patterns
    #283, change the semantics of removeFiles, no implicit deletes
    #283, speed up getDirectoryFiles on patterns involving literals
    Work with stack test
    Add RemEnv cmd option
    Fix a small space leak on Linux
    Fix a space leak in profile summary information
    Properly handle *//* in file pattern (at least one directory)
    #303, allow ninja rule names to have dots in them
    Properly handle /// in file pattern (a wildcard then /)
    Make sure Windows separators work in |%> FilePattern rules
    #296, fix multifile rules when using --digest-and-input
    #286, document the relationship between getEnv and lookupEnv
    #285, improve the documentation for phony/alwaysRerun
    #284, fix the documentation for normaliseEx
    #293, add getProgress
    #270, support MonadFix for Rules, allows recursive cache
    #295, discount time spent waiting for a cache
    #294, give better error messages if you hit GHC bug #10793
    #4, add a lock file to stop corrupting databases

0.15.5, released 2015-08-04
    #279, make usingConfigFile do a need on the config file
    Fix a bug where predicted progress could sometimes be ??
    #264, make the test suite less non-deterministic

0.15.4, released 2015-06-20
    Undo a locally modified file

0.15.3, released 2015-06-20
    #254, in staunch mode, print out all exceptions
    Require extra-1.3
    #259, discount time waiting for a resource in profiles
    #252, have the docs test configure not in dist

0.15.2, released 2015-05-22
    #248, add another example of using cmd
    #245, initial support for fsatrace lint checking
    Reexport -<.> from filepath where available
    Hoogle #106, trigger on filepath version, not GHC version
    Add AddEnv and AddPath command options
    #243, close fds in child processes when spawning commands
    Make Ninja variable handling more accurate

0.15.1, released 2015-04-25
    If you have Shakefile.hs, pass it all arguments without interp
    Add shakeArgsPrune and shakeArgsPruneWith
    #228, allow running cmd async by collecting the ProcessHandle
    Make getShakeOptions/processorCount of 0 return the used value
    #212, document how to get a full terminal with cmd
    #225, warn if there are no want/action statements
    #232, don't ignore phony order-only dependencies in Ninja
    #226, add escaping for GraphViz labels
    #227, add StdinBS for passing a bytestring as Stdin
    Make cmd Timeout call terminateProcess as well

0.15, released 2015-03-26
    #203, make shakeFiles a directory rather than a file prefix
    #220, add getHashedShakeVersion helper
    #220, add shakeVersionIgnore to ignore version numbers
    #219, run Shakefile.hs from the shake binary
    #218, fix issues with incorrect unchanging with no digests
    #218, fix issue with ChangeModtimeAndDigest on unchanging files
    #216, work around GHC 7.10 RC3 bug 10176
    #213, add phonys, a predicate phony rule
    Add CmdTime and CmdLine results to cmd/command
    Fix parseMakefile for words with multiple escapes in them
    #205, add WithStdout, like WithStderr
    #27, add support for capturing Stdout/Stderr with bytestrings
    Add FileStdout/FileStderr to write a stream direct to a file
    #211, add Stdouterr to capture both Stdout and Stderr streams
    Require extra-1.1 (to use nubOrd)
    Generalise cmd to work with Maybe [String]
    Add unit for use with cmd
    IMPORTANT: Incompatible on disk format change
    #209, improve orderOnly dependencies

0.14.3, released 2015-01-07
    Support for the filepath shipped with GHC 7.10
    Add Timeout option to command

0.14.2, released 2014-11-27
    #198, add operator to join FilePatterns
    #198, fix the <.> and other extension methods to work with //

0.14.1, released 2014-11-21
    #196, change the links to point at http://www.shakebuild.com/
    Improve the error messages when &%> or &?> go wrong
    Normalise file patterns used by &?>

0.14, released 2014-11-19
    Make FilePattern on Windows cope with all path separators
    Rename normalise to normaliseEx
    Expose native System.FilePath in Development.Shake.FilePath
    #193, eliminate the custom version of
    #193, make need/orderOnly call normalise on their arguments
    #177, use the $PATH variable to resolve cmd
    #182, use js-jquery and js-flot
    #195, add getConfigKeys
    #194, replace *> with %> as the operator for rules
    #188, improve the docs on FilePattern
    #187, removeFiles on a missing directory should do nothing
    #166, add withTempFile and withTempDir in the Action monad
    Ensure Ninja response files are always deleted
    Fix incorrect units when showing profile to the console
    Require the extra library
    #178, make --help show the current executable name

0.13.4, released 2014-09-19
    #171, fix the --demo mode on Linux

0.13.3, released 2014-09-16
    Ensure you wait until the progress thread cleans up
    Add --demo mode
    Add --progress=record and --progress=replay
    Add -j to run with one thread per processor
    Fix progress reporting to work even on the first run
    Apply abbreviations to staunch failure messages
    Fix failure progress messages during staunch mode
    #165, make |*> matching with simple files much faster
    Add toStandard to the FilePath module
    #130, support msvc_deps_prefix in Ninja
    #157, allow variable substitution in include/subninja statements
    #147, add the version in the profile reports
    #147, add --numeric-version flag
    #28, switch to continuation passing style, uses far less threads
    #156, add readConfigFileWithEnv for configs with an initial env
    #129, fix processor count detection on Mac with -threaded
    #155, add --live and shakeLiveFiles
    #153, handle escaped spaces in parseMakefile
    #152, fix typo in the manual
    Add copyFileChanged
    Fix the leaf column in profile reports
    Add Maybe instances for cmd arguments
    Add a --profile flag (alias for --report)
    Switch to an underlying continuation monad

0.13.2, released 2014-06-20
    #95, ensure progress never gets corrupted
    #124, add a profile report demo
    #128, allow long Ninja command lines
    Fix --report=- for builds with no commands in them

0.13.1, released 2014-05-28
    Remove all package upper bounds
    #126, Ninja compatibility if Ninja fails to create a file
    #123, generate Chrome compatible traces

0.13, released 2014-05-27
    #122, make --report=- write a report to stdout
    Improve the profile report summary
    #122, turn shakeReport into a list of files, instead of a Maybe
    #60, improve how command lines are printed out
    #113, print info about copyFile' and removeFilesAfter at -V
    Replace **> with |*>, ?>> with &?> and *>> with &*>
    IMPORTANT: Incompatible on disk format change
    #83, support digest rules for files
    #83, add shakeChange parameter and --digest* args
    #83, add equalValue function to Rule typeclass
    Deprecate defaultRule
    Make literal *> matches take precedence over wildcard matches
    #120, add a priority function

0.12, released 2014-05-09
    #62, move to a ReaderT/IORef for the Action monad
    Add DEPRECATED pragmas on system' calls
    Delete Development.Shake.Sys, use command or cmd instead
    Add a 'console' pool to Ninja
    Avoid using System.Cmd (deprecated in GHC HEAD)
    #41, use higher precision file times on POSIX
    #117, use higher precision times for Unicode files on Windows
    #118, add support for Ninja -t compdb
    #119, more test fixes for Linux GHC 7.8

0.11.7, released 2014-05-05
    #119, test fixes for Linux GHC 7.8

0.11.6, released 2014-05-04
    #114, run build.sh in the test suite with sh
    #115, fix the test suite on Linux GHC 7.8
    #116, fix for GHC 7.10

0.11.5, released 2014-04-25
    Include the sources for flot and jQuery in the dist tarball
    Note that .. in FilePattern values is unlikely to be correct
    #109, make removeFiles delete directories that match
    #84, make removeFiles not delete empty unmatched directories
    #111, fixes to the user manual
    #110, don't give a warning if -threaded is turned off
    #103, don't suggest -qg and -qb, they fail without -threaded
    #102, fix up when the LHS starts with .

0.11.4, released 2014-03-20
    Work with QuickCheck-2.7 (which defines ===)
    #100, handle GraphViz missing gracefully
    Fix up the profiling report generation
    #99, add getEnvWithDefault

0.11.3, released 2014-02-27
    #97, fix a serialisation bug when > 254 arguments to need

0.11.2, released 2014-02-18
    #96, fix a bug in addPath that caused $PATH to be added twice

0.11.1, released 2014-02-06
    #94, GHC 7.8 support
    Add a Config module
    #89, support :: as a build rule separator

0.11, released 2014-01-03
    Add alternatives to allow overlapping rules
    Make storedValue take a ShakeOptions structure
    Generalise the newCache function
    Improve the performance of the Ninja parser
    Make the database more compact
    #84, ensure you normalise removeFile patterns first
    #82, make -j0 guess at the number of processors
    #81, add --lint-tracker to use tracker.exe
    Add trackRead, trackWrite
    Add trackUse, trackChange, trackAllow
    #85, move rule creation functions into Development.Shake.Rule
    Mark Development.Shake.Sys as DEPRECATED with a pragma
    Change shakeLint to be of type Maybe Lint, instead of Bool
    #50, add shakeArgsAccumulate

0.10.10, released 2013-11-16
    Improve Ninja --lint checking

0.10.9, released 2013-11-15
    #76, add Ninja specific lint checking
    #75, add orderOnly dependencies
    #76, add needed, to assert a need doesn't rebuild
    #78, don't use cabal in the test suite
    #77, add shakeRunCommands and --skip-commands
    #67, add withVerbosity
    #51, add getShakeOptions
    Lots of Haddock improvements
    Deprecate system', use cmd or command instead
    #53, add addPath and addEnv to modify environment variables
    Make all search path things return the native separators
    #73, if Haskell can't write a unicode filename, skip the test
    Print out the entire database in diagnostics mode
    Rewrite database writes, old code could break metadata on error
    #25, optimise building with Ninja
    Fix many bugs in FilePath.normalise
    Require QuickCheck-2.0 or higher
    Change how the makefile parser treats \ characters
    Add a --no-build flag
    #74, allow cmd to be used with result IO
    Add a Util module, with Makefile parsing

0.10.8, released 2013-10-15
    Allow unix-1.5.1 again
    Require Cabal 1.10 or above
    Convert to the cabal test compatible test suite

0.10.7, released 2013-09-26
    Support time-1.2
    #71, fix a bug when requesting all three outputs from command
    #64, add progressProgram and use it in progressSimple
    Remove isRunning, kill the progress thread on completion
    #47, improve the performance of ?==, especially on "//*"
    #68, improve the docs for addOracle
    #55, ensure if you need phony targets you rebuild every time
    #52, ensure all command output is printed
    #20, document preferred compilation flags
    #20, speed up the Shake executable with GHC RTS flags
    #39, print out more of the version string on database change
    #41, require unix-2.6 or above
    #48, make it clear filetimes are hashes
    #43, improve the error messages on lint failures
    #45, avoid use of nub
    #45, avoid use of intersect which is O(n^2)
    #26, add newThrottle to create throttling resources
    #26, add unsafeExtraThread to run more than shakeThreads
    #32, add a check that withResources does not call need
    #614, support unicode filenames
    Require unordered-containers-0.2.1 or above
    Reduce stack usage when lint checking
    #24, create the output directories for the multiple file rule
    Improvements to match Ninja behaviour

0.10.6, released 2013-06-28
    Include command/cmd in the default module
    #16, change the scoping in subninja
    #15, fix up the Ninja default targets
    Fix up --version output

0.10.5, released 2013-06-14
    Improve progress prediction for the first build
    Fix removeFiles when there is a directory argument and a pattern
    Delete shakeDeterministic, shakeThreads=1 has the same effect

0.10.4, released 2013-06-13
    Fix writeFileChanged for line endings on Windows
    Support arguments to --progress
    Set the number of capabilities based on -j flags
    Add shakeTimings, to time the separate stages
    Add AssumeSkip, mostly for benchmarking
    Normalise file arguments before calling want
    Expose ShakeException
    Add isFailure to ShakeProgress, and display in progressDisplay
    Add withResources
    Add -<.> alias for replaceExtension
    Add selectors on Stdout/Stdin/Exit
    Add CmdResult instance for ExitCode

0.10.3, released 2013-05-12
    Upgrade to Flot-0.8
    Small documentation markup fixes

0.10.2, released 2013-03-30
    Require process-1.1 or above
    Improve progress guesses (use exponential decay of work rate)
    Improve profiling report error messages
    Add a Command module, with alternatives to system' etc.
    #593, don't assume doesDirectoryExist = not . doesFileExist
    #594, swap how traced messages are printed out
    Add Chatty verbosity, for what was Loud
    Make repeated --verbose keep increasing the verbosity
    If shakeThreads==1 then always use a deterministic pool
    Add shakeLineBuffering, defaulting to True
    Improve the performance of profiling
    Documentation improvements and corrections
    Eliminate StepKey from the profiling output
    Add a --no-time flag
    Makefile improvements
    Avoid use of ghc-options in .cabal file

0.10.1, released 2013-03-18
    Allow the shake executable to build

0.10, released 2013-03-17
    Add phony, for writing phony actions
    Add removeFilesAfter
    Switch to the new profiling code
    Add actionOnException and actionFinally
    Add shakeArgsWith
    Rename shakeWithArgs to shakeArgs and change the clean argument
    Remove the -f alias for --flush
    Add a shake executable which runs Makefiles
    Rename shake to shake-test
    Change how progress messages are written
    Do not write a beep in progressDisplay, avoids a beep
    Add exe function to FilePath

0.9.1, released 2013-02-28
    GHC head (7.7) compatibility by removing the Rules fundep

0.9, released 2013-02-16
    Make the tests work on Linux
    Fix report generation on machines whose text format is not ASCII
    Make Directory functions work the same on Linux/Mac
    Change shakeVersion to be a String
    Stop alwaysRerun causing lint failures
    Make shakeLint check that the current directory does not change
    Add shakeOutput setting
    Add removeFiles function
    Add Sys module for writing concise system calls
    Add getEnv function
    Add shakeWithArgs and shakeOptDescrs
    Add newCache to cache the parsed contents of files
    Add newResourceIO and move newResource into the Rules monad
    Add shakeStorageLog, to help diagnosing obscure database errors

0.8, released 2013-01-31
    Improve the Oracle documentation
    Allow getDirectoryFiles to operate recursively

0.7, released 2013-01-29
    Change getDirectoryFiles to take a set of file patterns
    Add doesDirectoryExist function
    Add shakeAbbreviations feature
    Add a new Progress module for sensible progress messages
    Spawn shakeProgress on a separate thread, a safer default
    Improve the runtime error messages
    Add a quietly function, to hide traced commands
    Print main status messages when running a traced command
    Display the exit code when a system command fails
    Fix AssumeClean when the result exists but has never been built
    IMPORTANT: Incompatible on disk format change
    Change the storage to not write messages on --silent
    Add Applicative on Rules
    Add Applicative on Action

0.6, released 2013-01-09
    Make addOracle return a result of askOracle
    Export the class bodies from Classes
    Don't export ShakeValue from Classes

0.5, released 2013-01-05
    IMPORTANT: Incompatible on disk format change
    Add the ShakeValue constraint synonym
    Change the Oracle to be strongly typed
    Add a Classes module

0.4, released 2013-01-02
    Add shakeFlush to control how often flush is called
    Fix a serious space leak in the thread pool
    #502, add shakeAssume to assume files are clean/dirty
    Remove validStored, replace with storedValue
    Remove the default validStored class, almost never right
    Remove Eq/Ord from ShakeOptions (now contains a func)
    #501, add statistics gathering functions to help report progress
    Ensure registering witnesses is thread safe
    Ensure GHC does not over-optimise type registration

0.3.10, released 2012-12-17
    Add Paths.hs so you can run the tests from the tarball

0.3.9, released 2012-12-17
    Don't rely on toList returning a consistent order
    Allow hashable-1.2

0.3.8, released 2012-12-12
    Fix up FilePattern so "foo//bar" ?== "foo/bar"

0.3.7, released 2012-12-02
    Update the cabal file

0.3.6, released 2012-11-27
    Add addOracles, for implementing more advanced oracles
    Add withoutActions, for implementing command line arguments

0.3.5, released 2012-11-27
    #571, vastly improve the correctness of FilePattern
    #574, documentation typos
    Expose rulePriority

0.3.4, released 2012-09-17
    Update documentation with links to ICFP 2012 paper/talk

0.3.3, released 2012-09-17
    Minor refactorings

0.3.2, released 2012-09-12
    Fix cabal specification on non-Windows

0.3.1, released 2012-09-08
    Don't use syntax also stolen by QuasiQuotes
    Avoid warnings about unused monadic values
    Fix up getModificationTime for GHC 7.6
    Don't assume catch is exported by Prelude (GHC 7.6 compat)
    Improve the error message when failing to build with *>> or ?>>
    Introduce a portable flag, to switch to portable FileTime ops
    Improve the error message when failing to build a file
    Ensure errors raised bubble up quickly
    Significant improvements to the profile output
    Only trace the program name by default, not the entire command
    Allow unordered-containers < 0.2 again
    Intern all keys, so they are only stored once
    Optimise modification time checking on Unix
    Optimise modification time checking on Windows
    Make some fields smaller on disk
    IMPORTANT: Incompatible on disk format change
    Switch to ByteString for storing File
    Add shakeDeterministic, along with a default random pool
    Make the Monad/Monoid instance for Rules strict

0.3, released 2012-06-17
    #550, ensure the journal thread is locked
    #550, if your thread dies while reading the database then die
    #550, ensure you reset to before the slop before writing
    If the witness table is completely corrupt give a better error
    Make storage work with exceptions whose messages raise errors
    Journal error conditions, to ensure they don't repeat
    Better messages on invalid database versions
    Complete rewrite of the storage layer and journal design
    IMPORTANT: Incompatible on disk format change

0.2.11, released 2012-05-19
    IMPORTANT: #546, don't save the database to where it was created

0.2.10, released 2012-05-19
    Add systemCwd for running with a particular current directory
    Ensure the database is saved to where it was created
    #545, make sure if shake is killed, it kills its children

0.2.9, released 2012-03-25
    Require unordered-containers 0.2.1 or higher, fixes bugs
    Allow transformers-0.3.*
    Add a random script generator and tester
    Improve the documentation for withResource
    Fix the unordered-containers constraint, was too loose
    Don't write report generation messages in Quiet mode
    Add ?>> operator, a generalisation of *>>

0.2.8, released 2012-02-24
    Rename shakeDump to shakeReport
    Add profile report generator

0.2.7, released 2012-02-18
    #530, require unordered-containers >= 0.1.4.3

0.2.6, released 2012-02-17
    Improve the documentation code fragments (more links)
    Add support for managing finite resources

0.2.5, released 2012-02-15
    Require hashable-1.1.2.3, which has a TypeRep instance
    Add Data/Typeable instances for ShakeOptions
    Add Data instance for Verbosity

0.2.4, released 2012-02-11
    Include the C example source files
    Significant documentation improvements

0.2.3, released 2012-02-09
    Create the shakeFiles directory if missing

0.2.2, released 2012-02-05
    Allow deepseq-1.3.*
    Add a basic lint checking mode
    Remove the Dirty state entirely (was incorrect)

0.2.1, released 2012-01-28
    Put diagnostics in more places
    Add a C example

0.2, released 2012-01-23
    Add shakeStaunch
    Rename shakeParallel to shakeThreads
    Delete the lint support (never actually worked)
    Completely rewrite the central build algorithm
    Add verbosity Diagnostic
    Improve FilePath.normalise
    Add writeFileChanged
    Add systemOutput
    Add an Oracle module
    Add an explicit Verbosity type
    Change to lower precedence for *> etc

0.1.5, released 2012-01-01
    Allow deepseq-1.2

0.1.4, released 2012-01-01
    Don't export currentRule/currentStack
    Fix **>, was matching the wrong way round
    Fix FilePath normalise so it works properly on Windows
    Properly detect recursive rules, instead of looping
    Add *>> for building multiple files at once

0.1.3, released 2011-12-22
    Fix a bug where a file rule could return with a lazy error
    Make sure all the files are stored in binary, not text
    Fix a pattern match error in getDirectoryFiles

0.1.2, released 2011-12-18
    Add a warning in the description

0.1.1, released 2011-12-18
    Significantly improved documentation
    system' now takes an initial argument for the program

0.1, released 2011-12-18
    Many changes to signatures, some basic documentation
    Additional demo, to build shake itself

0.0, released 2011-12-11
    Initial version, not ready for public use

shake-0.19.8/LICENSE

Copyright Neil Mitchell 2011-2024.
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:

    * Redistributions of source code must retain the above copyright
      notice, this list of conditions and the following disclaimer.

    * Redistributions in binary form must reproduce the above
      copyright notice, this list of conditions and the following
      disclaimer in the documentation and/or other materials provided
      with the distribution.

    * Neither the name of Neil Mitchell nor the names of other
      contributors may be used to endorse or promote products derived
      from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

shake-0.19.8/README.md0000644000000000000000000000452507346545000012362 0ustar0000000000000000
# Shake

[![Hackage version](https://img.shields.io/hackage/v/shake.svg?label=Hackage)](https://hackage.haskell.org/package/shake) [![Stackage version](https://www.stackage.org/package/shake/badge/nightly?label=Stackage)](https://www.stackage.org/package/shake) [![Build status](https://img.shields.io/github/actions/workflow/status/ndmitchell/shake/ci.yml?branch=master)](https://github.com/ndmitchell/shake/actions)

Shake is a tool for writing build systems - an alternative to make, Scons, Ant etc. Shake has been used commercially for over five years, running thousands of builds per day. The website for Shake users is at [shakebuild.com](https://shakebuild.com).

#### Documentation

* **Why use Shake?** Shake lets you write large robust build systems, which deal properly with generated source files and run quickly. If you are writing a custom build system of any moderate size (more than a few rules) you should use Shake. The advantages over other build systems are detailed in the document [Why choose Shake?](https://shakebuild.com/why).
* **How do I use Shake?** Shake is a Haskell library that you use to define your rules. The [Shake manual](https://shakebuild.com/manual) provides a walk through of a small but realistic example, assuming no Haskell knowledge.
* [Generated documentation](https://hackage.haskell.org/packages/archive/shake/latest/doc/html/Development-Shake.html) for all functions, includes lots of examples.
* [Blog posts](https://neilmitchell.blogspot.co.uk/search/label/shake) detailing ongoing development work.
* [Academic paper](https://ndmitchell.com/downloads/paper-shake_before_building-10_sep_2012.pdf) on the underlying principles behind Shake.

#### Other links

* [Download the Haskell package](https://hackage.haskell.org/package/shake) from Hackage and install it using Cabal.
* [Mailing list](https://groups.google.com/forum/?fromgroups#!forum/shake-build-system) for any questions/bugs/thoughts on Shake. If you need more information and aren't sure where to start, use the mailing list.
* [Questions](https://stackoverflow.com/questions/tagged/shake-build-system) can be asked on StackOverflow with the tag `shake-build-system`.

Is your company using Shake? Write something public (even just a [tweet to `@ndm_haskell`](https://twitter.com/ndm_haskell)) and I'll include a link [on the website](https://shakebuild.com/#who-uses-shake).

shake-0.19.8/Setup.hs0000644000000000000000000000005607346545000012532 0ustar0000000000000000
import Distribution.Simple
main = defaultMain

shake-0.19.8/docs/0000755000000000000000000000000007346545000012025 5ustar0000000000000000shake-0.19.8/docs/Manual.md0000644000000000000000000007113707346545000013575 0ustar0000000000000000
# Shake Manual

_See also: [Shake links](https://github.com/ndmitchell/shake#readme); [Why choose Shake](Why.md#readme); [Function documentation](https://hackage.haskell.org/packages/archive/shake/latest/doc/html/Development-Shake.html)_

Shake is a Haskell library for writing build systems -- designed as a replacement for `make`. This document describes how to get started with Shake, assuming no prior Haskell knowledge.
First, let's take a look at a Shake build system:

    import Development.Shake
    import Development.Shake.Command
    import Development.Shake.FilePath
    import Development.Shake.Util

    main :: IO ()
    main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
        want ["_build/run" <.> exe]

        phony "clean" $ do
            putInfo "Cleaning files in _build"
            removeFilesAfter "_build" ["//*"]

        "_build/run" <.> exe %> \out -> do
            cs <- getDirectoryFiles "" ["//*.c"]
            let os = ["_build" </> c -<.> "o" | c <- cs]
            need os
            cmd_ "gcc -o" [out] os

        "_build//*.o" %> \out -> do
            let c = dropDirectory1 $ out -<.> "c"
            let m = out -<.> "m"
            cmd_ "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
            neededMakefileDependencies m

This build system builds the executable `_build/run` from all C source files in the current directory. It will rebuild if you add or remove any C files in the directory, if the C files themselves change, or if any headers used by the C files change. All generated files are placed in `_build`, and a `clean` command is provided that will wipe all the generated files. In the rest of this manual we'll explain how the above code works and how to extend it.

#### Running this example

To run the example above:

1. Install the [Haskell Stack](https://haskellstack.org/), which provides a Haskell compiler and package manager.
2. Type `stack install shake`, to build and install Shake and all its dependencies.
3. Type `stack exec -- shake --demo`, which will create a directory containing a sample project, the above Shake script (named `Shakefile.hs`), and execute it (which can be done by `runhaskell Shakefile.hs`). For more details see a [trace of `shake --demo`](Demo.md).

## Basic syntax

This section explains enough syntax to write a basic Shake build script.

#### Boilerplate

The build system above starts with the following boilerplate:
import Development.Shake
import Development.Shake.Command
import Development.Shake.FilePath
import Development.Shake.Util
 
main :: IO ()
main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
    build rules
All the interesting build-specific code is placed under build rules. Many build systems will be able to reuse that boilerplate unmodified.

#### Defining targets

A target is a file we want the build system to produce (typically executable files). For example, if we want to produce the file `manual/examples.txt` we can write:

    want ["manual/examples.txt"]

The `want` function takes a list of strings. In Shake lists are written `[item1,item2,item3]` and strings are written `"contents of a string"`. Special characters in strings can be escaped using `\` (e.g. `"\n"` for newline) and directory separators are always written `/`, even on Windows.

Most files have the same name on all platforms, but executable files on Windows usually have the `.exe` extension, while on POSIX they have no extension. When writing cross-platform build systems (like the initial example), we can write:

    want ["_build/run" <.> exe]

The `<.>` function adds an extension to a file path, and the built-in `exe` variable evaluates to `"exe"` on Windows and `""` otherwise.

#### Defining rules

A rule describes the steps required to build a file. A rule has two components, a pattern and some actions:
pattern %> \out -> do
    actions
The pattern is a string saying which files this rule can build. It may be a specific file (e.g. `"manual/examples.txt" %> ...`) or may use wildcards:

* The `*` wildcard matches anything apart from a directory separator. For example `"manual/*.txt"` would define a rule for any `.txt` file in the `manual` directory, including `manual/examples.txt`, but would not match `manual/examples.zip`, `examples.txt` or `manual/docs/examples.txt`.
* The `//` wildcard matches any number of complete path components. For example `//*.txt` would define a rule for any `.txt` file, including `manual/examples.txt`. As another example, `manual//examples.txt` would match any file named `examples.txt` inside `manual`, including both `manual/examples.txt` and `manual/docs/examples.txt`.

It is an error for multiple patterns to match a file being built, so you should keep patterns minimal. Looking at the two rules in the initial example:

    "_build/run" <.> exe %> ...
    "_build//*.o" %> ...

The first matches only the `run` executable, using `<.> exe` to ensure the executable is correctly named on all platforms. The second matches any `.o` file anywhere under `_build`. As examples, `_build/main.o` and `_build/foo/bar.o` both match, while `main.o` and `_build/main.txt` do not.

Lots of compilers produce `.o` files, so if you are combining two different languages, say C and Haskell, use the extensions `.c.o` and `.hs.o` to avoid overlapping rules.

The actions are a list of steps to perform and are listed one per line, indented beneath the rule. Actions both express dependencies (say what this rule uses) and run commands (actually generate the file). During the action the `out` variable is bound to the file that is being produced.

#### A simple rule

Let's look at a simple example of a rule:

    "*.rot13" %> \out -> do
        let src = out -<.> "txt"
        need [src]
        cmd_ "rot13" src "-o" out

This rule can build any `.rot13` file.
Imagine we are building `"file.rot13"`; the rule proceeds by:

* Using `let` to define a local variable `src`, using the `-<.>` extension replacement function, which removes the extension from a file and adds a new extension. When `out` is `"file.rot13"` the variable `src` will become `"file.txt"`.
* Using `need` to introduce a dependency on the `src` file, ensuring that if `src` changes then `out` will be rebuilt and that `src` will be up-to-date before any further commands are run.
* Using `cmd_` to run the command line `rot13 file.txt -o file.rot13`, which should read `file.txt` and write out `file.rot13` being the ROT13 encoding of the file.

Many rules follow this pattern -- calculate some local variables, `need` some dependencies, then use `cmd_` to perform some actions. We now discuss each of the three statements.

#### Local variables

Local variables can be defined as:
let variable = expression
Where variable is a name consisting of letters, numbers and underscores (a-z, A-Z, 0-9 and \_). All variables _must_ start with a lower-case letter. An expression is any combination of variables and function calls, for example `out -<.> "txt"`. A list of some common functions is discussed later.

Variables are evaluated by substituting the expression everywhere the variable is used. In the simple example we could have equivalently written:

    "*.rot13" %> \out -> do
        need [out -<.> "txt"]
        cmd_ "rot13" (out -<.> "txt") "-o" out

Variables are local to the rule they are defined in, cannot be modified, and should not be defined multiple times within a single rule.

#### File dependencies

You can express a dependency on a file with:

    need ["file.src"]

To depend on multiple files you can write:

    need ["file.1","file.2"]

Or alternatively:

    need ["file.1"]
    need ["file.2"]

It is preferable to use fewer calls to `need`, if possible, as multiple files required by a `need` can be built in parallel.

#### Running external commands

The `cmd_` function allows you to call system commands, e.g. `gcc`. Taking the initial example, we see:

    cmd_ "gcc -o" [out] os

After substituting `out` (a string variable) and `os` (a list of strings variable) we might get:

    cmd_ "gcc -o" ["_make/run"] ["_build/main.o","_build/constants.o"]

The `cmd_` function takes any number of space-separated expressions. Each expression can be either a string (which is treated as a space-separated list of arguments) or a list of strings (which is treated as a direct list of arguments). Therefore the above command line is equivalent to either of:

    cmd_ "gcc -o _make/run _build/main.o _build/constants.o"
    cmd_ ["gcc","-o","_make/run","_build/main.o","_build/constants.o"]

To properly handle unknown string variables it is recommended to enclose them in a list, e.g. `[out]`, so that even if `out` contains a space it will be treated as a single argument.
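As an illustration of that splitting rule, here is a small standalone model in plain Haskell. This is only a sketch of the behaviour described above, not Shake's actual implementation -- the `Arg` type and `flattenArgs` function are invented for the example:

```haskell
-- A toy model of how cmd_ assembles its argument list: plain strings
-- are split on spaces, while lists are passed through so each element
-- is exactly one argument (protecting embedded spaces).
data Arg = Str String | List [String]

flattenArgs :: [Arg] -> [String]
flattenArgs = concatMap expand
  where
    expand (Str s)   = words s  -- space-separated fragments
    expand (List xs) = xs       -- each element is one argument

main :: IO ()
main = print (flattenArgs
    [Str "gcc -o", List ["_make/run"], List ["_build/main.o", "_build/constants.o"]])
```

Both spellings of the `gcc` command line in the text flatten to the same argument list, which is why enclosing a variable in a list, e.g. `[out]`, keeps a path containing a space as one argument.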
The `cmd_` function as presented here will fail if the system command returns a non-zero exit code, but see later for how to treat failing commands differently.

#### Filepath manipulation functions

Shake provides a complete library of filepath manipulation functions (see the [docs for `Development.Shake.FilePath`](https://hackage.haskell.org/package/shake/docs/Development-Shake-FilePath.html)), but the most common are:

* `str1 </> str2` -- add the path components together with a slash, e.g. `"_build" </> "main.o"` equals `"_build/main.o"`.
* `str1 <.> str2` -- add an extension, e.g. `"main" <.> "o"` equals `"main.o"`.
* `str1 ++ str2` -- append two strings together, e.g. `"hello" ++ "world"` equals `"helloworld"`.
* `str1 -<.> str2` -- replace an extension, e.g. `"main.c" -<.> "o"` equals `"main.o"`.
* `dropExtension str` -- drop the final extension of a filepath if it has one, e.g. `dropExtension "main.o"` equals `"main"`, while `dropExtension "main"` equals `"main"`.
* `takeFileName str` -- drop the path component, e.g. `takeFileName "_build/src/main.o"` equals `"main.o"`.
* `dropDirectory1 str` -- drop the first path component, e.g. `dropDirectory1 "_build/src/main.o"` equals `"src/main.o"`.

## Advanced Syntax

The following section covers more advanced operations that are necessary for moderately complex build systems, but not simple ones.

#### Directory listing dependencies

The function `getDirectoryFiles` can retrieve a list of files within a directory:

    files <- getDirectoryFiles "" ["//*.c"]

After this operation `files` will be a variable containing all the files matching the pattern `"//*.c"` (those with the extension `.c`) starting at the directory `""` (the current directory). To obtain all `.c` and `.cpp` files in the `src` directory we can write:

    files <- getDirectoryFiles "src" ["//*.c","//*.cpp"]

The `getDirectoryFiles` operation is tracked by the build system, so if the files in a directory change the rule will rebuild in the next run.
You should only use `getDirectoryFiles` on source files, not files that are generated by the build system, otherwise the results will change while you are running the build and the build may be inconsistent.

#### List manipulations

Many functions work with lists of values. The simplest operation on lists is to join two lists together, which we do with `++`. For example, `["main.c"] ++ ["constants.c"]` equals `["main.c", "constants.c"]`.

Using a _list comprehension_ we can produce new lists, applying functions to the elements and filtering them. As an example:

    ["_build" </> x -<.> "o" | x <- inputs]

This expression grabs each element from `inputs` and names it `x` (the `x <- inputs`, pronounced "`x` is drawn from `inputs`"), then applies the expression `"_build" </> x -<.> "o"` to each element. If we start with the list `["main.c","constants.c"]`, we would end up with `["_build/main.o", "_build/constants.o"]`.

List expressions also allow us to filter the list; for example, we could know that the file `"evil.c"` is in the directory, but should not be compiled. We can extend that to:

    ["_build" </> x -<.> "o" | x <- inputs, x /= "evil.c"]

The `/=` operator checks for inequality, and any predicate after the drawn-from part is used to restrict which elements of the list are available.

#### Using `gcc` to collect headers

One common problem when building `.c` files is tracking down which headers they transitively import, and thus must be added as a dependency. We can solve this problem by asking `gcc` to create a file while building that contains a list of all the imports. If we run:

    gcc -c main.c -o main.o -MMD -MF main.m

That will compile `main.c` to `main.o`, and also produce a file `main.m` containing the dependencies. To add these dependencies as dependencies of this rule we can call:

    neededMakefileDependencies "main.m"

Now, if either `main.c` or any headers transitively imported by `main.c` change, the file will be rebuilt.
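The source-to-object mapping used in these list comprehensions can be tried outside Shake. `Development.Shake.FilePath` is based on the standard `System.FilePath` module that ships with GHC, which provides the same `</>` operator; `-<.>` is defined locally here via the standard `replaceExtension` to keep the sketch dependency-free (the file names are made up for illustration):

```haskell
import System.FilePath ((</>), replaceExtension)

-- (-<.>) replaces a file's extension, as used throughout the manual.
(-<.>) :: FilePath -> String -> FilePath
(-<.>) = replaceExtension
infixr 7 -<.>

-- The manual's mapping from C sources to object files under _build,
-- filtering out the hypothetical evil.c.
objectFiles :: [FilePath] -> [FilePath]
objectFiles inputs = ["_build" </> x -<.> "o" | x <- inputs, x /= "evil.c"]

main :: IO ()
main = print (objectFiles ["main.c", "constants.c", "evil.c"])
```

Because `-<.>` binds tighter than `</>`, the comprehension body parses as `"_build" </> (x -<.> "o")`, first swapping the extension and then prefixing the directory.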
In the initial example the complete rule is:

    "_build//*.o" %> \out -> do
        let c = dropDirectory1 $ out -<.> "c"
        let m = out -<.> "m"
        cmd_ "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
        neededMakefileDependencies m

We first compute the source file `c` (e.g. `"main.c"`) that is associated with the `out` file (e.g. `"_build/main.o"`). We then compute a temporary file `m` to write the dependencies to (e.g. `"_build/main.m"`). We then call `gcc` using the `-MMD -MF` flags and finally call `neededMakefileDependencies`.

#### Top-level variables

Variables local to a rule are defined using `let`, but you can also define top-level variables. Top-level variables are defined before the `main` call, for example:

    buildDir = "_build"

You can now use `buildDir` in place of `"_build"` throughout. You can also define parametrised variables (functions) by adding argument names:

    buildDir x = "_build" </> x

We can now write:

    buildDir ("run" <.> exe) %> \out -> do
        ...

All top-level variables and functions can be thought of as being expanded wherever they are used, although in practice they may have their evaluation shared.

#### A clean command

A standard clean command is defined as:

    phony "clean" $ do
        putInfo "Cleaning files in _build"
        removeFilesAfter "_build" ["//*"]

Running the build system with the `clean` argument, e.g. `runhaskell Shakefile.hs clean`, will remove all files under the `_build` directory. This clean command is formed from two separate pieces. Firstly, we can define `phony` commands as:
phony "name" $ do
    actions
Where name is the name used on the command line to invoke the actions, and actions are the list of things to do in response. These names are not dependency tracked and are run afresh each time they are requested.

The actions can be any standard build actions, although for a `clean` rule, `removeFilesAfter` is typical. This function waits until after any files have finished building (which will be none, if you do `runhaskell Shakefile.hs clean`) then deletes all files matching `//*` in the `_build` directory. The `putInfo` function writes out a message to the console, as long as `--quiet` was not passed.

## Running

This section covers how to run the build system you have written.

#### Compiling the build system

As shown before, we can use `runhaskell Shakefile.hs` to execute our build system, but doing so causes the build script to be compiled afresh each time. A more common approach is to add a shell script that compiles the build system and runs it. In the example directory you will find `build.sh` (Linux) and `build.bat` (Windows), both of which execute the same interesting commands. Looking at `build.sh`:

    #!/bin/sh
    mkdir -p _shake
    ghc --make Shakefile.hs -rtsopts -threaded -with-rtsopts=-I0 -outputdir=_shake -o _shake/build && _shake/build "$@"

This script creates a folder named `_shake` for the build system objects to live in, then runs `ghc --make Shakefile.hs` to produce `_shake/build`, then executes `_shake/build` with all arguments it was given.

The `-with-rtsopts` flag instructs the Haskell compiler to disable "idle garbage collection", making more CPU available for the commands you are running, as [explained here](https://stackoverflow.com/questions/34588057/why-does-shake-recommend-disabling-idle-garbage-collection/).

Now you can run a build by typing `stack exec ./build.sh` on Linux, or `stack exec build.bat` on Windows. On Linux you may want to alias `build` to `stack exec ./build.sh`.
For the rest of this document we will assume `build` runs the build system.

_Warning:_ You should not use the `-threaded` flag for GHC 7.6 or below because of a [GHC bug](https://ghc.haskell.org/trac/ghc/ticket/7646). If you do turn on `-threaded`, you should include `-qg` in `-with-rtsopts`.

#### Command line flags

The initial example build system supports a number of command line flags, including:

* `build` will compile all files required by `want`.
* `build _build/main.o` will compile enough to create `_build/main.o`, ignoring all `want` requirements.
* `build clean` will delete the contents of `_build`, because of our `phony` command.
* `build --help` will list out all flags supported by the build system, currently 36 flags. Most flags supported by `make` are also supported by Shake-based build systems.
* `build -j8` will compile up to 8 rules simultaneously; by default Shake uses 1 processor.

Most flags can also be set within the program by modifying the `shakeOptions` value. As an example, `build --metadata=_metadata` causes all Shake metadata files to be stored with names such as `_metadata/.shake.database`. Alternatively we can write `shakeOptions{shakeFiles="_metadata"}` instead of our existing `shakeFiles="_build"`. Values passed on the command line take preference over those given by `shakeOptions`. Multiple overrides can be given to `shakeOptions` by separating them with a comma, for example `shakeOptions{shakeFiles="_build", shakeThreads=8}`.

#### Progress prediction

One useful feature of Shake is that it can predict the remaining build time, based on how long previous builds have taken. The number is only a prediction, but it does take account of which files require rebuilding, how fast your machine is currently running, parallelism settings etc.
You can display progress messages in the title bar of a window by either:

* Running `build --progress`
* Setting `shakeOptions{shakeProgress = progressSimple}`

The progress message will be displayed in the title bar of the window, for example `3m12s (82%)` to indicate that the build is 82% complete and is predicted to take a further 3 minutes and 12 seconds. If you are running Windows 7 or higher and place the [`shake-progress`](https://github.com/ndmitchell/shake/releases/tag/shake-progress-1) utility somewhere on your `%PATH%` then the progress will also be displayed in the taskbar progress indicator:

![](shake-progress.png)

Progress prediction is likely to be relatively poor during the first build and after running `build clean`, as then Shake has no information about the predicted execution time for each rule. To rebuild from scratch without running clean (because you really want to see the progress bar!) you can use the argument `--always-make`, which assumes all rules need rerunning.

#### Lint

Shake features a built-in "lint" feature to check the build system is well formed. To run it, use `build --lint`. You are likely to catch more lint violations if you first `build clean`. Sadly, lint does _not_ catch missing dependencies. However, it does catch:

* Changing the current directory, typically with `setCurrentDirectory`. You should never change the current directory within the build system as multiple rules running at the same time share the current directory. You can still run `cmd_` calls in different directories using the `Cwd` argument.
* Outputs that change after Shake has built them. The usual cause of this error is if the rule for `foo` also writes to the file `bar`, despite `bar` having a different rule producing it.

There is a performance penalty for building with `--lint`, but it is typically small. For more details see [this page on all the Lint options](Lint.md).

#### Profiling and optimisation

Shake features an advanced profiling feature.
To build with profiling run `build --report`, which will generate an interactive HTML profile named `report.html`. This report lets you examine what happened in that run, what takes most time to run, what rules depend on what etc. For a full explanation of how to profile and optimise a build system, including getting accurate timings and using Haskell profiling, see [the profiling and optimisation page](Profiling.md).

#### Tracing and debugging

To debug a build system there are a variety of techniques that can be used:

* Run with lint checking enabled (`--lint`), which may spot and describe the problem for you.
* Run in single-threaded mode (`-j1`) to make any output clearer by not interleaving commands.
* By default a Shake build system prints out a message every time it runs a command. Use verbose mode (`--verbose`) to print more information to the screen, such as which rule is being run. Additional `--verbose` flags increase the verbosity. Three verbosity flags produce output intended for someone debugging the Shake library itself, rather than a build system based on it.
* To raise a build error call `error "error message"`. Shake will abort, showing the error message.
* To output additional information use `putInfo "output message"`. This message will be printed to the console when it is reached.
* To show additional information with either `error` or `putInfo`, use `error $ show ("message", myVariable)`. This allows you to show any local variables.

## Extensions

This section details a number of build system features that are useful in some build systems, but not the initial example, and not most average build systems.

#### Advanced `cmd` usage

The `cmd_` function has a related function, `cmd`, that can also obtain the stdout and stderr streams, along with the exit code. As an example:

    (Exit code, Stdout out, Stderr err) <- cmd "gcc --version"

Now the variable `code` is bound to the exit code, while `out` and `err` are bound to the stdout and stderr streams.
If `ExitCode` is not requested then any non-zero return value will raise an error.

Both `cmd_` and `cmd` also take additional parameters to control how the command is run. As an example:

    cmd_ Shell (Cwd "temp") "pwd"

This runs the `pwd` command through the system shell, after first changing to the `temp` directory.

#### Dependencies on environment variables

You can use tracked dependencies on environment variables using the `getEnv` function. As an example:

    link <- getEnv "C_LINK_FLAGS"
    let linkFlags = fromMaybe "" link
    cmd_ "gcc -o" [output] inputs linkFlags

This example gets the `$C_LINK_FLAGS` environment variable (which is `Maybe String`, namely a `String` that might be missing), then using `fromMaybe` defines a local variable `linkFlags` that is the empty string when `$C_LINK_FLAGS` is not set. It then passes these flags to `gcc`. If the `$C_LINK_FLAGS` environment variable changes then this rule will rebuild.

#### Dependencies on extra information

Using Shake we can depend on arbitrary extra information, such as the version of `gcc`, allowing us to automatically rebuild all C files when a different compiler is placed on the path. To track the version, we can define a rule for the file `gcc.version` which changes only when `gcc --version` changes:

    "gcc.version" %> \out -> do
        alwaysRerun
        Stdout stdout <- cmd "gcc --version"
        writeFileChanged out stdout

This rule has the action `alwaysRerun`, meaning it will be run in every execution that requires it, so the `gcc --version` is always checked. This rule defines no dependencies (no `need` actions), so if it lacked `alwaysRerun`, this rule would only be run when `gcc.version` was missing. The rule then runs `gcc --version`, storing the output in `stdout`. Finally, it calls `writeFileChanged`, which writes `stdout` to `out`, but only if the contents have changed. The use of `writeFileChanged` is important; otherwise `gcc.version` would change in every run.
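The compare-before-write idea behind `writeFileChanged` can be modelled in ordinary Haskell IO. This is a simplified sketch, not Shake's implementation (the real function also deals with encodings and more careful file handling), and `writeFileChangedSim` is a name invented for the example:

```haskell
import System.Directory (doesFileExist)
import System.IO (withFile, IOMode(ReadMode), hGetContents')  -- hGetContents' needs base >= 4.15 (GHC 9.0)

-- Write the file only when the contents differ, so an unchanged file
-- keeps its old modification time and dependents are not rebuilt.
-- Returns whether a write happened (the real writeFileChanged returns ()).
writeFileChangedSim :: FilePath -> String -> IO Bool
writeFileChangedSim file new = do
    exists <- doesFileExist file
    old <- if exists
        then Just <$> withFile file ReadMode hGetContents'  -- strict read, handle closed afterwards
        else pure Nothing
    if old == Just new
        then pure False            -- unchanged: skip the write entirely
        else do writeFile file new; pure True

main :: IO ()
main = do
    w1 <- writeFileChangedSim "gcc.version.tmp" "gcc 12.2\n"
    w2 <- writeFileChangedSim "gcc.version.tmp" "gcc 12.2\n"
    print (w1, w2)  -- the second call finds identical contents and skips the write
```

The skipped write is exactly what stops `gcc.version` from changing on every run when the compiler version is stable.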
To use this rule, we `need ["gcc.version"]` in every rule that calls `gcc`. Shake also contains a feature called "oracles", which lets you do the same thing without the use of a file, which is sometimes more convenient. Interested readers should look at the function documentation for `addOracle`.

#### Resources

Resources allow us to limit the number of simultaneous operations more precisely than just the number of simultaneous jobs (the `-j` flag). For example, calls to compilers are usually CPU bound but calls to linkers are usually disk bound. Running 8 linkers will often cause an 8 CPU system to grind to a halt. We can limit ourselves to 4 linkers with:

    disk <- newResource "Disk" 4
    want [show i <.> "exe" | i <- [1..100]]
    "*.exe" %> \out -> do
        withResource disk 1 $ do
            cmd_ "ld -o" [out] ...
    "*.o" %> \out -> do
        cmd_ "cl -o" [out] ...

Assuming `-j8`, this allows up to 8 compilers, but only a maximum of 4 linkers.

#### Multiple outputs

Some tools, for example [bison](https://www.gnu.org/software/bison/), can generate multiple outputs from one execution. We can track these in Shake using the `&%>` operator to define rules:

    ["//*.bison.h","//*.bison.c"] &%> \[outh, outc] -> do
        let src = outc -<.> "y"
        cmd_ "bison -d -o" [outc] [src]

Now we define a list of patterns that are matched, and get a list of output files. If any output file is required, then all output files will be built, with proper dependencies.

#### Changing build rules

Shake build systems are set up to rebuild files when the dependencies change, but mostly assume that the build rules themselves do not change (including both the code and the shell commands contained within). To minimise the impact of build rule changes there are three approaches:

_Use configuration files:_ Most build information, such as which files a C file includes, can be computed from source files.
Where such information is not available, such as which C files should be linked together to form an executable, use configuration files to provide the information. The rule for linking can use these configuration files, which can be properly tracked. Moving any regularly changing configuration into separate files will significantly reduce the number of build system changes.

_Depend on the build source:_ One approach is to depend on the build system source in each of the rules, then if _any_ rules change, _everything_ will rebuild. While this option is safe, it may cause a significant number of redundant rebuilds. As a restricted version of this technique, for a generated file you can include a dependency on the generator source and use `writeFileChanged`. If the generator changes it will rerun, but typically only a few generated files will change, so little is rebuilt.

_Use a version stamp:_ There is a field named `shakeVersion` in the `ShakeOptions` record. If the build system changes in a significant and incompatible way, you can change this field to force a full rebuild. If you want all rules to depend on all rules, you can put a hash of the build system source in the version field, as [described here](https://stackoverflow.com/questions/18532552/shake-how-to-reliably-automatically-force-rebuild-when-my-rules-change-becomi/18532553#18532553).

## The Haskell Zone

From now on, this manual assumes some moderate knowledge of Haskell. Most of the things in this section are either impossible to do with other build systems or can be faked by shell script. None of the Haskell is particularly advanced.

#### Haskell Expressions

You can use any Haskell function at any point. For example, to only link files without numbers in them, we can `import Data.Char` and then write:

    let os = ["_build" </> c -<.> "o" | c <- inputs, not $ any isDigit c]

For defining non-overlapping rules it is sometimes useful to use a more advanced predicate.
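Such predicates are ordinary Haskell functions, so they can be developed and tested in isolation before being attached to a rule. This small sketch uses the standard `System.FilePath` module (which `Development.Shake.FilePath` is based on) to check for a purely numeric extension; the non-empty guard is a deliberate addition so that extension-less files do not match:

```haskell
import Data.Char (isDigit)
import System.FilePath (takeExtension)

-- True for files whose extension is entirely digits, e.g. "output.123".
-- Without the null check, a file with no extension would also match,
-- because all isDigit "" is True.
hasNumericExt :: FilePath -> Bool
hasNumericExt x = not (null ext) && all isDigit ext
  where ext = drop 1 (takeExtension x)  -- drop the leading '.'

main :: IO ()
main = print (map hasNumericExt ["output.123", "main.c", "README"])
```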
For example, to define a rule that only builds results which have a numeric extension, we can use the `?>` rule definition function:

    (\x -> all isDigit $ drop 1 $ takeExtension x) ?> \out -> do
        ...

We first get the extension with `takeExtension`, then use `drop 1` to remove the leading `.` that `takeExtension` includes, then test that all the characters are numeric. The standard `%>` operator is actually defined as:

    pattern %> actions = (pattern ?==) ?> actions

Where `?==` is a function for matching file patterns.

#### Haskell Actions

You can run any Haskell `IO` action by using `liftIO`. As an example:

    liftIO $ launchMissiles True

Most common `IO` operations to run as actions are already wrapped and available in the Shake library, including `readFile'`, `writeFile'` and `copyFile'`. Other useful functions can be found in `System.Directory`.
shake-0.19.8/docs/manual/0000755000000000000000000000000007346545000013302 5ustar0000000000000000shake-0.19.8/docs/manual/Shakefile.hs0000644000000000000000000000131407346545000015530 0ustar0000000000000000import Development.Shake
import Development.Shake.Command
import Development.Shake.FilePath
import Development.Shake.Util

main :: IO ()
main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
    want ["_build/run" <.> exe]

    phony "clean" $ do
        putInfo "Cleaning files in _build"
        removeFilesAfter "_build" ["//*"]

    "_build/run" <.> exe %> \out -> do
        cs <- getDirectoryFiles "" ["//*.c"]
        let os = ["_build" </> c -<.> "o" | c <- cs]
        need os
        cmd_ "gcc -o" [out] os

    "_build//*.o" %> \out -> do
        let c = dropDirectory1 $ out -<.> "c"
        let m = out -<.> "m"
        cmd_ "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
        neededMakefileDependencies m
shake-0.19.8/docs/manual/build.bat0000644000000000000000000000021207346545000015064 0ustar0000000000000000@mkdir _shake 2> nul
@ghc --make Shakefile.hs -rtsopts -threaded -with-rtsopts=-I0 -outputdir=_shake -o _shake/build && _shake\build %*
shake-0.19.8/docs/manual/build.sh0000644000000000000000000000021607346545000014734
0ustar0000000000000000#!/bin/sh
mkdir -p _shake
ghc --make Shakefile.hs -rtsopts -threaded -with-rtsopts=-I0 -outputdir=_shake -o _shake/build && _shake/build "$@"
shake-0.19.8/docs/manual/constants.c0000644000000000000000000000011107346545000015453 0ustar0000000000000000
char msg[] = "Hello Shake Users!";

char* message() {
    return msg;
}
shake-0.19.8/docs/manual/constants.h0000644000000000000000000000002107346545000015460 0ustar0000000000000000char* message();
shake-0.19.8/docs/manual/main.c0000644000000000000000000000014707346545000014374 0ustar0000000000000000#include <stdio.h>
#include "constants.h"

int main() {
    printf("%s\n", message());
    return 0;
}
shake-0.19.8/docs/shake-progress.png0000644000000000000000000002302007346545000015465 0ustar0000000000000000
Shake report Loading... shake-0.19.8/html/progress.html0000644000000000000000000000216207346545000014574 0ustar0000000000000000 Shake progress diagnostic

Shake Progress Analysis

The red line represents the real performance, with time to completion on the X axis and predicted time on the Y axis. A perfect progress prediction would follow the gray line.

Loading...

shake-0.19.8/html/shake.js0000644000000000000000000011661507346545000013504 0ustar0000000000000000"use strict"; function bindPlot(element, data, options) { const redraw = () => { if ($(element).is(":visible")) $.plot($(element), data.get(), options); }; window.setTimeout(redraw, 1); $(window).on("resize", redraw); data.event(redraw); } function varLink(name) { return React.createElement("a", { href: "https://hackage.haskell.org/package/shake/docs/Development-Shake.html#v:" + name }, React.createElement("tt", null, name)); } function newTable(columns, data, sortColumn, sortDescend) { const f = (x) => ({ name: x.field, label: x.label, width: x.width, cellClasses: x.alignRight ? "right" : "" }); const formatters = {}; for (const c of columns) formatters[c.field] = c.show || ((x) => x); const table = new DGTable({ adjustColumnWidthForSortArrow: false, cellFormatter: (val, colname) => formatters[colname](val), columns: columns.map(f), width: DGTable.Width.SCROLL }); $(table.el).css("height", "100%"); window.setTimeout(() => { table.render(); table.tableHeightChanged(); if (sortColumn) table.sort(sortColumn, sortDescend); table.setRows(data.get(), true); }, 1); let toRender = false; data.event(xs => { table.setRows(xs, true); if ($(table.el).is(":visible")) table.render(); else toRender = true; }); $(window).on("resize", () => { if ($(table.el).is(":visible")) { table.tableHeightChanged(); if (toRender) { table.render(); toRender = false; } } }); return React.createElement("div", { style: "height:100%;width:100%;" }, table.el); } // These are global variables mutated/queried by query execution let environmentAll; // All the profiles let environmentThis; // The specific profile under test let environmentGroup; // The group produced as a result function group(x) { environmentGroup.push(x); return true; } function leaf() { return environmentThis.depends.length === 0; } function run(i) { if (i === undefined) return environmentThis.built; else return environmentThis.built === 
i; } function changed() { return environmentThis.changed === environmentThis.built; } function unchanged() { return !changed(); } function named(r, groupName) { if (r === undefined) return environmentThis.name; const res = execRegExp(r, environmentThis.name); if (res === null) { if (groupName === undefined) return false; else { group(groupName); return true; } } if (res.length !== 1) { for (let i = 1; i < res.length; i++) group(res[i]); } return true; } function command(r, groupName) { const n = (environmentThis.traces || []).length; if (r === undefined) return n === 0 ? "" : environmentThis.traces[0].command; for (const t of environmentThis.traces) { const res = execRegExp(r, t.command); if (res === null) continue; if (res.length !== 1) { for (let j = 1; j < res.length; j++) group(res[j]); } return true; } if (groupName === undefined) return false; else { group(groupName); return true; } } function profileLoaded(profileRaw) { $(document.body).empty().append(profileRoot(unraw(profileRaw))); } function unraw(xs) { const ans = xs.map((x, i) => ({ index: i, name: x[0], execution: x[1], built: x[2], changed: x[3], depends: x.length > 4 ? x[4] : [], rdepends: [], traces: x.length > 5 ?
x[5].map(y => ({ command: y[0], start: y[1], stop: y[2] })) : [] })); for (const p of ans) for (const ds of p.depends) for (const d of ds) ans[d].rdepends.push(p.index); return ans; } function profileRoot(profile) { const [s, search] = createSearch(profile); const t = createTabs([["Summary", () => reportSummary(profile)], ["Command plot", () => reportCmdPlot(profile)], ["Commands", () => reportCmdTable(profile, search)], ["Rules", () => reportRuleTable(profile, search)], ["Parallelizability", () => reportParallelism(profile)], ["Details", () => reportDetails(profile, search)] // , ["Why rebuild", () => reportRebuild(profile, search)] ]); return React.createElement("table", { class: "fill" }, React.createElement("tr", null, React.createElement("td", { style: "padding-top: 8px; padding-bottom: 8px;" }, React.createElement("a", { href: "https://shakebuild.com/", style: "font-size: 20px; text-decoration: none; color: #3131a7; font-weight: bold;" }, "Shake profile report"), React.createElement("span", { style: "color:gray;white-space:pre;" }, " - generated at ", generated, " by Shake v", version))), React.createElement("tr", null, React.createElement("td", null, s)), React.createElement("tr", null, React.createElement("td", { height: "100%" }, t))); } function createTabs(xs) { const bodies = xs.map(x => { const el = React.createElement("div", { style: "padding:5px;width:100%;height:100%;min-width:150px;min-height:150px;overflow:auto;display:none;" }); const upd = lazy(() => $(el).append(x[1]())); return pair(el, upd); }); let lbls = []; const f = (i) => () => { bodies[i][1](); lbls.map((x, j) => $(x).toggleClass("active", i === j)); bodies.map((x, j) => $(x[0]).toggle(i === j)); $(window).trigger("resize"); }; lbls = xs.map((x, i) => React.createElement("a", { onclick: f(i) }, x[0])); f(0)(); return React.createElement("table", { class: "fill" }, React.createElement("tr", null, React.createElement("td", null, React.createElement("table", { width: "100%", style: 
"border-spacing:0px;" }, React.createElement("tr", { class: "tabstrip" }, React.createElement("td", { width: "20", class: "bottom" }, "\u00A0"), React.createElement("td", { style: "padding:0px;" }, lbls), React.createElement("td", { width: "100%", class: "bottom" }, "\u00A0"))))), React.createElement("tr", { height: "100%" }, React.createElement("td", { style: "background-color:white;" }, bodies.map(fst)))); } // A mapping from names (rule names or those matched from rule parts) // to the indices in profiles. class Search { constructor(profile, mapping) { this.profile = profile; if (mapping !== undefined) this.mapping = mapping; else { this.mapping = {}; for (const p of profile) this.mapping[p.name] = [p.index]; } } forEachProfiles(f) { for (const s in this.mapping) f(this.mapping[s].map(i => this.profile[i]), s); } forEachProfile(f) { this.forEachProfiles((ps, group) => ps.forEach(p => f(p, group))); } mapProfiles(f) { const res = []; this.forEachProfiles((ps, group) => res.push(f(ps, group))); return res; } mapProfile(f) { const res = []; this.forEachProfile((p, group) => res.push(f(p, group))); return res; } } function createSearch(profile) { const caption = React.createElement("div", null, "Found ", profile.length, " entries, not filtered or grouped."); const input = React.createElement("input", { id: "search", type: "text", value: "", placeholder: "Filter and group", style: "width: 100%; font-size: 16px; border-radius: 8px; padding: 5px 10px; border: 2px solid #999;" }); const res = new Prop(new Search(profile)); $(input).on("change keyup paste", () => { const s = $(input).val(); if (s === "") { res.set(new Search(profile)); $(caption).text("Found " + profile.length + " entries, not filtered or grouped."); } else if (s.indexOf("(") === -1) { const mapping = {}; let found = 0; for (const p of profile) { if (p.name.indexOf(s) !== -1) { found++; mapping[p.name] = [p.index]; } } res.set(new Search(profile, mapping)); $(caption).text("Substring filtered to " +
found + " / " + profile.length + " entries, not grouped."); } else { let f; try { f = new Function("return " + s); } catch (e) { $(caption).text("Error compiling function, " + e); return; } const mapping = {}; let groups = 0; let found = 0; environmentAll = profile; for (const p of profile) { environmentThis = p; environmentGroup = []; let bool; try { bool = f(); } catch (e) { $(caption).text("Error running function, " + e); return; } if (bool) { found++; const name = environmentGroup.length === 0 ? p.name : environmentGroup.join(" "); if (name in mapping) mapping[name].push(p.index); else { groups++; mapping[name] = [p.index]; } } } res.set(new Search(profile, mapping)); $(caption).text("Function filtered to " + found + " / " + profile.length + " entries, " + (groups === found ? "not grouped." : groups + " groups.")); } }); const body = React.createElement("table", { width: "100%", style: "padding-bottom: 17px;" }, React.createElement("tr", null, React.createElement("td", { width: "100%" }, input), React.createElement("td", { style: "padding-left:6px;padding-right: 6px;" }, searchHelp(input))), React.createElement("tr", null, React.createElement("td", null, caption))); return [body, res]; } function searchHelp(input) { const examples = [["Only the last run", "run(0)"], ["Named 'Main'", "named(\"Main\")"], ["Group by file extension", "named(/(\\.[_0-9a-z]+)$/)"], ["No dependencies (an input)", "leaf()"], ["Didn't change when it last rebuilt", "unchanged()"], ["Ran 'gcc'", "command(\"gcc\")"] ]; const f = (code) => () => { $(input).val((i, x) => x + (x === "" ? 
"" : " && ") + code); $(input).trigger("change"); }; const dropdown = React.createElement("div", { class: "dropdown", style: "display:none;" }, React.createElement("ul", { style: "padding-left:30px;" }, examples.map(([desc, code]) => React.createElement("li", null, React.createElement("a", { onclick: f(code) }, React.createElement("tt", null, code)), " ", React.createElement("span", { class: "note" }, desc))))); const arrow_down = React.createElement("span", { style: "vertical-align:middle;font-size:80%;" }, "\u25BC"); const arrow_up = React.createElement("span", { style: "vertical-align:middle;font-size:80%;display:none;" }, "\u25B2"); const show_inner = () => { $(dropdown).toggle(); $(arrow_up).toggle(); $(arrow_down).toggle(); }; return React.createElement("div", null, React.createElement("button", { style: "white-space:nowrap;padding-top:5px;padding-bottom:5px;", onclick: show_inner }, React.createElement("b", { style: "font-size:150%;vertical-align:middle;" }, "+"), "\u00A0 Filter and Group \u00A0", arrow_down, arrow_up), dropdown); } function initProgress() { $(function () { $(".version").html("Generated by Shake " + version + "."); $("#output").html(""); for (const x of progress) { var actual = []; var ideal = []; // Start at t = 5 seconds, since the early progress jumps a lot for (var t = 5; t < x.values.length; t++) { var y = x.values[t]; actual.push([y.idealSecs, y.actualSecs]); ideal.push([y.idealSecs, y.idealSecs]); } var ys = [{ data: ideal, color: "gray" }, { label: x.name, data: actual, color: "red" }]; var div = $("
"); $("#output").append(div); $.plot(div, ys, { xaxis: { transform: function (v) { return -v; }, inverseTransform: function (v) { return -v; } } }); } }); } // Stuff that Shake generates and injects in function untraced(p) { return Math.max(0, p.execution - p.traces.map(t => t.stop - t.start).sum()); } ///////////////////////////////////////////////////////////////////// // BASIC UI TOOLKIT class Prop { constructor(val) { this.val = val; this.callback = () => { return; }; } get() { return this.val; } set(val) { this.val = val; this.callback(val); } event(next) { const old = this.callback; this.callback = val => { old(val); next(val); }; next(this.val); } map(f) { const res = new Prop(f(this.get())); this.event(a => res.set(f(a))); return res; } } jQuery.fn.enable = function (x) { // Set the values to enabled/disabled return this.each(function () { if (x) $(this).removeAttr("disabled"); else $(this).attr("disabled", "disabled"); }); }; ///////////////////////////////////////////////////////////////////// // BROWSER HELPER METHODS // Given "?foo=bar&baz=1" returns {foo:"bar",baz:"1"} function uriQueryParameters(s) { // From https://stackoverflow.com/questions/901115/get-querystring-values-with-jquery/3867610#3867610 const params = {}; const a = /\+/g; // Regex for replacing addition symbol with a space const r = /([^&=]+)=?([^&]*)/g; const d = (x) => decodeURIComponent(x.replace(a, " ")); const q = s.substring(1); while (true) { const e = r.exec(q); if (!e) break; params[d(e[1])] = d(e[2]); } return params; } ///////////////////////////////////////////////////////////////////// // STRING FORMATTING function showTime(x) { function digits(x) { const s = String(x); return s.length === 1 ? 
"0" + s : s; } if (x >= 3600) { x = Math.round(x / 60); return Math.floor(x / 60) + "h" + digits(x % 60) + "m"; } else if (x >= 60) { x = Math.round(x); return Math.floor(x / 60) + "m" + digits(x % 60) + "s"; } else return x.toFixed(2) + "s"; } function showPerc(x) { return (x * 100).toFixed(2) + "%"; } function showInt(x) { // From https://stackoverflow.com/questions/2901102/how-to-print-a-number-with-commas-as-thousands-separators-in-javascript // Show, with commas return x.toString().replace(/\B(?=(\d{3})+(?!\d))/g, ","); } function showRun(run) { return run === 0 ? "Latest run" : run + " run" + plural(run) + " ago"; } function plural(n, not1 = "s", is1 = "") { return n === 1 ? is1 : not1; } ///////////////////////////////////////////////////////////////////// // MISC function compareFst(a, b) { return a[0] - b[0]; } function compareSnd(a, b) { return a[1] - b[1]; } function compareSndRev(a, b) { return b[1] - a[1]; } function pair(a, b) { return [a, b]; } function triple(a, b, c) { return [a, b, c]; } function fst([x, _]) { return x; } function snd([_, x]) { return x; } function execRegExp(r, s) { if (typeof r === "string") return s.indexOf(r) === -1 ? 
null : []; else return r.exec(s); } function cache(key, op) { const store = {}; return k => { const s = key(k); if (!(s in store)) store[s] = op(k); return store[s]; }; } function lazy(thunk) { let store = null; let done = false; return () => { if (!done) { store = thunk(); done = true; } return store; }; } Array.prototype.sum = function () { let res = 0; for (const x of this) res += x; return res; }; Array.prototype.insertSorted = function (x, compare) { let start = 0; let stop = this.length - 1; let middle = 0; while (start <= stop) { middle = Math.floor((start + stop) / 2); if (compare(this[middle], x) > 0) stop = middle - 1; else start = middle + 1; } this.splice(start, 0, x); return this; }; Array.prototype.concatLength = function () { let res = 0; for (const x of this) res += x.length; return res; }; Array.prototype.sortOn = function (f) { return this.map(x => pair(f(x), x)).sort(compareFst).map(snd); }; Array.prototype.last = function () { return this[this.length - 1]; }; Array.prototype.maximum = function (def) { if (this.length === 0) return def; let res = this[0]; for (let i = 1; i < this.length; i++) res = Math.max(res, this[i]); return res; }; Array.prototype.minimum = function (def) { if (this.length === 0) return def; let res = this[0]; for (let i = 1; i < this.length; i++) res = Math.min(res, this[i]); return res; }; // Use JSX with el instead of React.createElement // Originally from https://gist.github.com/sergiodxa/a493c98b7884128081bb9a281952ef33 // our element factory function createElement(type, props, ...children) { const element = document.createElement(type); for (const name in props || {}) { if (name.substr(0, 2) === "on") element.addEventListener(name.substr(2), props[name]); else element.setAttribute(name, props[name]); } for (const child of children.flat(10)) { const c = typeof child === "object" ? 
child : document.createTextNode(child.toString()); element.appendChild(c); } return element; } // How .tsx gets desugared const React = { createElement }; function reportCmdPlot(profile) { // first find the end point const runs = findRuns(profile); if (runs.length === 0) { return React.createElement("div", null, React.createElement("h2", null, "No data found"), React.createElement("p", null, "The Shake database contains no rules which ran traced commands."), React.createElement("p", null, "You can populate this information by using ", varLink("cmd"), " or wrapping your ", React.createElement("tt", null, "IO"), " actions in ", varLink("traced"), ".")); } const combo = React.createElement("select", null, runs.map(([run, time], i) => React.createElement("option", null, showRun(run) + " (" + showTime(time) + ") ", i === 0 ? "" : " - may be incomplete")), ";"); const warning = React.createElement("i", null); const plot = React.createElement("div", { style: "width:100%; height:100%;" }); const plotData = new Prop([]); bindPlot(plot, plotData, { legend: { show: true, position: "nw", sorted: "reverse" }, series: { stack: true, lines: { fill: 1, lineWidth: 0 } }, yaxis: { min: 0 }, xaxis: { tickFormatter: showTime } }); function setPlotData(runsIndex) { const [run, end] = runs[runsIndex]; const profileRun = profile.filter(p => p.built === run); // Make sure we max(0,) every step in the process, in case one does parallelism of threads const missing = profileRun.map(untraced).sum(); $(warning).text(missing < 1 ? 
"" : "Warning: " + showTime(missing) + " of execution was not traced."); const series = calcPlotData(end, profileRun, 100); const res = []; for (const s in series) res.push({ label: s, data: series[s].map((x, i) => pair(end * i / 100, x)) }); plotData.set(res); } setPlotData(0); $(combo).change(() => setPlotData(combo.selectedIndex)); return React.createElement("table", { class: "fill" }, React.createElement("tr", null, React.createElement("td", { width: "100%", style: "text-align:center;" }, React.createElement("h2", null, "Number of commands executing over time")), React.createElement("td", null, combo)), React.createElement("tr", null, React.createElement("td", { height: "100%", colspan: "2" }, plot)), React.createElement("tr", null, React.createElement("td", { colspan: "2", style: "text-align:center;" }, "Time since the start of building. ", warning))); } // Find which runs had traced commands and when the last stopped, sort so most recent first function findRuns(profile) { const runs = {}; for (const p of profile) { if (p.traces.length > 0) { if (p.traces.length === 1 && p.traces[0].command === "") continue; // the fake end command const old = runs[p.built]; const end = p.traces.last().stop; runs[p.built] = old === undefined ? 
end : Math.max(old, end); } } const runsList = []; for (const i in runs) runsList.push(pair(Number(i), runs[i])); runsList.sort(compareFst); return runsList; } function calcPlotData(end, profile, buckets) { const ans = {}; for (const p of profile) { for (const t of p.traces) { let xs; if (t.command in ans) xs = ans[t.command]; else { xs = []; for (let i = 0; i < buckets; i++) xs.push(0); // fill with 1 more element, but the last bucket will always be 0 ans[t.command] = xs; } const start = t.start * buckets / end; const stop = t.stop * buckets / end; if (Math.floor(start) === Math.floor(stop)) xs[Math.floor(start)] += stop - start; else { for (let j = Math.ceil(start); j < Math.floor(stop); j++) xs[j]++; xs[Math.floor(start)] += Math.ceil(start) - start; xs[Math.floor(stop)] += stop - Math.floor(stop); } } } return ans; } function reportCmdTable(profile, search) { const columns = [{ field: "name", label: "Name", width: 200 }, { field: "count", label: "Count", width: 65, alignRight: true, show: showInt }, { field: "total", label: "Total", width: 75, alignRight: true, show: showTime }, { field: "average", label: "Average", width: 75, alignRight: true, show: showTime }, { field: "max", label: "Max", width: 75, alignRight: true, show: showTime } ]; return newTable(columns, search.map(cmdData), "total", true); } function cmdData(search) { const res = {}; search.forEachProfile(p => { for (const t of p.traces) { const time = t.stop - t.start; if (t.command === "") continue; // do nothing else if (!(t.command in res)) res[t.command] = { count: 1, total: time, max: time }; else { const ans = res[t.command]; ans.count++; ans.total += time; ans.max = Math.max(ans.max, time); } } }); const res2 = []; for (const i in res) res2.push({ name: i, average: res[i].total / res[i].count, ...res[i] }); return res2; } function reportDetails(profile, search) { const result = React.createElement("div", { class: "details" }); const self = new Prop(0); search.event(xs => 
self.set(xs.mapProfile((p, _) => p.index).maximum())); const f = (i) => React.createElement("a", { onclick: () => self.set(i) }, profile[i].name); self.event(i => { const p = profile[i]; const content = React.createElement("ul", null, React.createElement("li", null, React.createElement("b", null, "Name:"), " ", p.name), React.createElement("li", null, React.createElement("b", null, "Built:"), " ", showRun(p.built)), React.createElement("li", null, React.createElement("b", null, "Changed:"), " ", showRun(p.changed)), React.createElement("li", null, React.createElement("b", null, "Execution time:"), showTime(p.execution)), React.createElement("li", null, React.createElement("b", null, "Traced commands:"), React.createElement("ol", null, p.traces.map(t => React.createElement("li", null, t.command, " took ", showTime(t.stop - t.start))))), React.createElement("li", null, React.createElement("b", null, "Dependencies:"), React.createElement("ol", null, p.depends.map(ds => React.createElement("li", null, React.createElement("ul", null, ds.map(d => React.createElement("li", null, f(d)))))))), React.createElement("li", null, React.createElement("b", null, "Things that depend on me:"), React.createElement("ul", null, p.rdepends.map(d => React.createElement("li", null, f(d)))))); $(result).empty().append(content); }); return result; } function reportParallelism(profile) { // now simulate for -j1 .. 
-j24 const plotData = [{ label: "Realistic (based on current dependencies)", data: [], color: "#3131a7" }, { label: "Ideal (if no dependencies and perfect speedup)", data: [], color: "green" }, { label: "Gap", data: [], color: "orange" } ]; let threads1; for (let threads = 1; threads <= 24; threads++) { const taken = simulateThreads(profile, threads)[0]; if (threads === 1) threads1 = taken; plotData[0].data.push([threads, taken]); plotData[1].data.push([threads, threads1 / threads]); plotData[2].data.push([threads, Math.max(0, taken - (threads1 / threads))]); } const plot = React.createElement("div", { style: "width:100%; height:100%;" }); bindPlot(plot, new Prop(plotData), { xaxis: { tickDecimals: 0 }, yaxis: { min: 0, tickFormatter: showTime } }); return React.createElement("table", { class: "fill" }, React.createElement("tr", null, React.createElement("td", { style: "text-align:center;" }, React.createElement("h2", null, "Time to build at different number of threads"))), React.createElement("tr", null, React.createElement("td", { height: "100%" }, plot)), React.createElement("tr", null, React.createElement("td", { style: "text-align:center;" }, "Number of threads available."))); } // Simulate running N threads over the profile, return: // [total time take, point at which each entry kicked off] function simulateThreads(profile, threads) { // How far are we through this simulation let timestamp = 0; // Who is currently running, with the highest seconds FIRST const running = []; const started = []; // Things that are done const ready = profile.filter(x => x.depends.length === 0); const waiting = profile.map(x => x.depends.concatLength()); // number I am waiting on before I am done function runningWait() { const [ind, time] = running.pop(); timestamp = time; for (const d of profile[ind].rdepends) { waiting[d]--; if (waiting[d] === 0) ready.push(profile[d]); } } while (true) { // Queue up as many people as we can while (running.length < threads && ready.length > 0) { 
const p = ready.pop(); started[p.index] = timestamp; running.insertSorted([p.index, timestamp + p.execution], compareSndRev); } if (running.length === 0) { if (waiting.maximum(0) > 0) throw new Error("Failed to run all tasks"); return [timestamp, started]; } runningWait(); } } function reportRebuild(profile, search) { const depth = []; for (const p of profile) { depth[p.index] = p.depends.flat().map(d => depth[d] + 1).maximum(0); } const ind = search.get().mapProfile((p, _) => p.index).sortOn(i => -depth[i])[0]; const p = profile[ind]; function f(p) { const res = []; while (p.depends.length !== 0) { const ds = p.depends.flat().sortOn(i => -depth[i]); res.push(React.createElement("li", null, React.createElement("select", { style: "width:400px;" }, ds.slice(0, 1).map(x => React.createElement("option", null, profile[x].name))))); p = profile[ds[0]]; } return res; } return React.createElement("div", null, React.createElement("h2", null, "Why did it rebuild?"), React.createElement("p", null, "Rule ", p.name + " " + (p.built === 0 ? 
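/*
The simulateThreads function above is a greedy list scheduler: keep the running rules sorted by finish time, always retire the earliest finisher, and release its reverse dependencies as they become ready. A minimal self-contained sketch of the same idea, using flat dependency lists and assumed field names (not the exact implementation):

```javascript
// Greedy scheduling sketch. Each entry: {execution, depends: [index], rdepends: [index]}.
function simulate(profile, threads) {
    const waiting = profile.map(p => p.depends.length);
    const ready = profile.map((p, i) => i).filter(i => waiting[i] === 0);
    const running = []; // [index, finish time], latest finisher first
    let timestamp = 0;
    while (true) {
        // Queue up as many rules as the thread limit allows
        while (running.length < threads && ready.length > 0) {
            const i = ready.pop();
            running.push([i, timestamp + profile[i].execution]);
            running.sort((a, b) => b[1] - a[1]);
        }
        if (running.length === 0) return timestamp;
        const [i, time] = running.pop(); // earliest finisher
        timestamp = time;
        for (const d of profile[i].rdepends)
            if (--waiting[d] === 0) ready.push(d);
    }
}

// Two independent 10s rules feeding one 5s rule:
const toy = [
    { execution: 10, depends: [], rdepends: [2] },
    { execution: 10, depends: [], rdepends: [2] },
    { execution: 5, depends: [0, 1], rdepends: [] },
];
// simulate(toy, 1) === 25, simulate(toy, 2) === 15
```
*/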
"rebuilt in the last run" : "did not rebuild")), React.createElement("ul", null, f(p))); } function reportRuleTable(profile, search) { const [etimes, wtimes] = calcEWTimes(profile, 24); const columns = [{ field: "name", label: "Name", width: 400 }, { field: "count", label: "Count", width: 65, alignRight: true, show: showInt }, { field: "leaf", label: "Leaf", width: 60, alignRight: true }, { field: "run", label: "Run", width: 50, alignRight: true }, { field: "changed", label: "Change", width: 60, alignRight: true }, { field: "time", label: "Time", width: 75, alignRight: true, show: showTime }, { field: "etime", label: "ETime", width: 75, alignRight: true, show: showTime }, { field: "wtime", label: "WTime", width: 75, alignRight: true, show: showTime }, { field: "untraced", label: "Untraced", width: 100, alignRight: true, show: showTime } ]; return newTable(columns, search.map(s => ruleData(etimes, wtimes, s)), "time", true); } // Calculate the exclusive time of each rule at some number of threads function calcEWTimes(profile, threads) { const [_, started] = simulateThreads(profile, threads); const starts = started.map((s, i) => pair(i, s)).sort(compareSnd); const costs = starts.map(([ind, start], i) => { // find out who else runs before I finish const execution = profile[ind].execution; const end = start + execution; let overlap = 0; // how much time I am overlapped for let exclusive = 0; // how much time I am the only runner let finisher = start; // the first overlapping person to finish for (let j = i + 1; j < starts.length; j++) { const [jInd, jStarts] = starts[j]; if (jStarts > end) break; overlap += Math.min(end - jStarts, profile[jInd].execution); exclusive += Math.max(0, Math.min(jStarts, end) - finisher); finisher = Math.max(finisher, jStarts + profile[jInd].execution); } exclusive += Math.max(0, end - finisher); return triple(ind, execution === 0 ?
0 : execution * (execution / (execution + overlap)), exclusive); }); const etimes = []; const wtimes = []; for (const [ind, etime, wtime] of costs) { etimes[ind] = etime; wtimes[ind] = wtime; } return [etimes, wtimes]; } function ruleData(etimes, wtimes, search) { return search.mapProfiles((ps, name) => ({ name, count: ps.length, leaf: ps.every(p => p.depends.length === 0), run: ps.map(p => p.built).minimum(), changed: ps.some(p => p.built === p.changed), time: ps.map(p => p.execution).sum(), etime: ps.map(p => etimes[p.index]).sum(), wtime: ps.map(p => wtimes[p.index]).sum(), untraced: ps.map(untraced).sum() })); } function reportSummary(profile) { let countLast = 0; // number of rules run in the last run let highestRun = 0; // highest run you have seen (add 1 to get the count of runs) let sumExecution = 0; // build time in total let sumExecutionLast = 0; // build time in the last run let countTrace = -1; let countTraceLast = -1; // traced commands run // both start at -1 because the end command will have run in the previous step let maxTraceStopLast = 0; // time the last traced command stopped for (const p of profile) { sumExecution += p.execution; highestRun = Math.max(highestRun, p.changed); // changed is always greater than or equal to built countTrace += p.traces.length; if (p.built === 0) { sumExecutionLast += p.execution; countLast++; countTraceLast += p.traces.length; if (p.traces.length > 0) maxTraceStopLast = Math.max(maxTraceStopLast, p.traces.last().stop); } } return React.createElement("div", null, React.createElement("h2", null, "Totals"), React.createElement("ul", null, React.createElement("li", null, React.createElement("b", null, "Runs:"), " ", showInt(highestRun + 1), " ", React.createElement("span", { class: "note" }, "number of times Shake has been run.")), React.createElement("li", null, React.createElement("b", null, "Rules:"), " ", showInt(profile.length), " (", showInt(countLast), " in last run) ", React.createElement("span", { class: "note" },
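/*
The ETime computed in calcEWTimes above discounts each rule's execution time by how much it overlapped other running rules, so the per-rule values sum to roughly the observed wall-clock time. A sketch of just that discount formula (the function name is ours):

```javascript
// etime = execution * execution / (execution + overlap)
// A rule that ran alone keeps its full time; a 10s rule fully
// overlapped by another 10s rule contributes only 5s.
function etime(execution, overlap) {
    return execution === 0 ? 0 : execution * (execution / (execution + overlap));
}
// etime(10, 0) === 10, etime(10, 10) === 5
```
*/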
"number of defined rules, e.g. individual files.")), React.createElement("li", null, React.createElement("b", null, "Traced:"), " ", showInt(countTrace), " (", showInt(countTraceLast), " in last run)", React.createElement("span", { class: "note" }, "number of calls to ", varLink("cmd"), " or ", varLink("traced"), "."))), React.createElement("h2", null, "Performance"), React.createElement("ul", null, React.createElement("li", null, React.createElement("b", null, "Build time:"), " ", showTime(sumExecution), " ", React.createElement("span", { class: "note" }, "how long a complete build would take single threaded.")), React.createElement("li", null, React.createElement("b", null, "Last build time:"), " ", showTime(maxTraceStopLast), " ", React.createElement("span", { class: "note" }, "how long the last build took.")), React.createElement("li", null, React.createElement("b", null, "Parallelism:"), " ", (maxTraceStopLast === 0 ? 0 : sumExecutionLast / maxTraceStopLast).toFixed(2), " ", React.createElement("span", { class: "note" }, "average number of commands executing simultaneously in the last build.")), React.createElement("li", null, React.createElement("b", null, "Speculative critical path:"), " ", showTime(speculativeCriticalPath(profile)), " ", React.createElement("span", { class: "note" }, "how long it would take on infinite CPUs.")), React.createElement("li", null, React.createElement("b", null, "Precise critical path:"), " ", showTime(preciseCriticalPath(profile)), " ", React.createElement("span", { class: "note" }, "the critical path without speculative execution.")))); } function speculativeCriticalPath(profile) { const criticalPath = []; // the critical path to any element let maxCriticalPath = 0; for (const p of profile) { let cost = 0; for (const ds of p.depends) for (const d of ds) cost = Math.max(cost, criticalPath[d]); cost += p.execution; maxCriticalPath = Math.max(cost, maxCriticalPath); criticalPath[p.index] = cost; } return maxCriticalPath; } /*
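As a concrete illustration of the staged-dependency effect described below (the rule names match the example below; the execution times are assumed for illustration):

```javascript
// r depends on [{a,b},{c,d}]: c and d are only demanded once a and b finish.
// Assumed times: a=3, b=1, c=2, d=1, r=1.
const t = { a: 3, b: 1, c: 2, d: 1, r: 1 };
// Speculative: every dependency could run at once.
const speculative = Math.max(t.a, t.b, t.c, t.d) + t.r; // 4
// Precise: the second dependency set waits for the first.
const precise = Math.max(t.a, t.b) + Math.max(t.c, t.d) + t.r; // 6
```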
Calculating a precise critical path, taking into account the deep dependency structure, is non-obvious. Dependencies have the type [{X}], e.g.: X = [{a,b},{c,d}] That is, r builds a and b, then after those both complete (assuming they don't change), it builds c and d, then it is finished. Importantly, r doesn't start building c/d until after a and b have finished. This detail extends the critical path. To calculate the precise critical path, we simulate with the notion of demand and waiting. */ function preciseCriticalPath(profile) { const waiting = profile.map(x => x.depends.concatLength()); // number I am waiting on before I am done const demanded = []; // I have been demanded by someone const oncomplete = []; // Completion functions const complete = []; // Who is complete already const running = []; let timestamp = 0; // demand dependency set N of a rule function demandN(p, round) { for (; round < p.depends.length; round++) { let todo = p.depends[round].length; // Number before we continue const step = () => { todo--; if (todo === 0) demandN(p, round + 1); }; for (const d of p.depends[round]) { if (complete[d]) todo--; else { const old = oncomplete[d]; oncomplete[d] = !old ?
step : () => { old(); step(); }; demand(profile[d]); } } if (todo !== 0) break; // todo === 0, so continue (equivalent to calling step but tail recursive) } } // demand a particular rule function demand(p) { if (demanded[p.index]) return; demanded[p.index] = true; if (waiting[p.index] === 0) running.insertSorted([p.index, timestamp + p.execution], compareSndRev); else demandN(p, 0); } // We don't know the targets we ask for, so we approximate by saying the ones which nothing depends on for (const p of profile) { if (p.rdepends.length === 0) demand(p); } while (running.length > 0) { const [ind, time] = running.pop(); timestamp = time; complete[ind] = true; if (oncomplete[ind]) { oncomplete[ind](); delete oncomplete[ind]; } for (const d of profile[ind].rdepends) { waiting[d]--; if (waiting[d] === 0 && demanded[d]) running.insertSorted([d, timestamp + profile[d].execution], compareSndRev); } } for (let i = 0; i < profile.length; i++) if (!complete[i]) throw new Error("Failed to run all tasks"); return timestamp; } shake-0.19.8/shake.cabal0000644000000000000000000003627707346545000013173 0ustar0000000000000000cabal-version: 1.18 build-type: Simple name: shake version: 0.19.8 license: BSD3 license-file: LICENSE category: Development, Shake author: Neil Mitchell maintainer: Neil Mitchell copyright: Neil Mitchell 2011-2024 synopsis: Build system library, like Make, but more accurate dependencies. description: Shake is a Haskell library for writing build systems - designed as a replacement for @make@. See "Development.Shake" for an introduction, including an example. The homepage contains links to a user manual, an academic paper and further information: . To use Shake the user writes a Haskell program that imports "Development.Shake", defines some build rules, and calls the 'Development.Shake.shakeArgs' function. Thanks to do notation and infix operators, a simple Shake build system is not too dissimilar from a simple Makefile. 
However, as build systems get more complex, Shake is able to take advantage of the excellent abstraction facilities offered by Haskell and easily support much larger projects. The Shake library provides all the standard features available in other build systems, including automatic parallelism and minimal rebuilds. Shake also provides more accurate dependency tracking, including seamless support for generated files, and dependencies on system information (e.g. compiler version). homepage: https://shakebuild.com bug-reports: https://github.com/ndmitchell/shake/issues tested-with: GHC==9.8, GHC==9.6, GHC==9.4, GHC==9.2, GHC==9.0, GHC==8.10, GHC==8.8 extra-doc-files: CHANGES.txt README.md docs/Manual.md docs/shake-progress.png extra-source-files: src/Paths.hs src/Test/C/constants.c src/Test/C/constants.h src/Test/C/main.c src/Test/Ninja/*.ninja src/Test/Ninja/*.output src/Test/Ninja/subdir/*.ninja src/Test/Progress/*.prog src/Test/Tar/list.txt src/Test/Tup/hello.c src/Test/Tup/newmath/root.cfg src/Test/Tup/newmath/square.c src/Test/Tup/newmath/square.h src/Test/Tup/root.cfg data-files: docs/manual/build.bat docs/manual/Shakefile.hs docs/manual/build.sh docs/manual/constants.c docs/manual/constants.h docs/manual/main.c html/profile.html html/progress.html html/shake.js source-repository head type: git location: https://github.com/ndmitchell/shake.git flag portable default: False manual: True description: Obtain FileTime using portable functions flag cloud default: False manual: True description: Enable cloud build features flag embed-files default: False manual: True description: Embed data files into the shake library flag threaded default: True manual: True description: Build shake with the threaded RTS library default-language: Haskell2010 hs-source-dirs: src build-depends: base >= 4.9, binary, bytestring, deepseq >= 1.1, directory >= 1.2.7.0, extra >= 1.6.19, filepath >= 1.4, filepattern, hashable >= 1.1.2.3, heaps >= 0.3.6.1, js-dgtable, js-flot, js-jquery, 
primitive, process >= 1.1, random, time, transformers >= 0.2, unordered-containers >= 0.2.7, utf8-string >= 0.3 if flag(embed-files) cpp-options: -DFILE_EMBED build-depends: file-embed >= 0.0.11, template-haskell if flag(portable) cpp-options: -DPORTABLE else if !os(windows) build-depends: unix >= 2.5.1 if !os(windows) build-depends: unix if flag(cloud) cpp-options: -DNETWORK build-depends: network, network-uri exposed-modules: Development.Shake Development.Shake.Classes Development.Shake.Command Development.Shake.Config Development.Shake.Database Development.Shake.FilePath Development.Shake.Forward Development.Shake.Rule Development.Shake.Util other-modules: Development.Ninja.Env Development.Ninja.Lexer Development.Ninja.Parse Development.Ninja.Type Development.Shake.Internal.Args Development.Shake.Internal.CmdOption Development.Shake.Internal.CompactUI Development.Shake.Internal.Core.Action Development.Shake.Internal.Core.Build Development.Shake.Internal.Core.Database Development.Shake.Internal.History.Shared Development.Shake.Internal.History.Symlink Development.Shake.Internal.History.Bloom Development.Shake.Internal.History.Cloud Development.Shake.Internal.History.Network Development.Shake.Internal.History.Server Development.Shake.Internal.History.Serialise Development.Shake.Internal.History.Types Development.Shake.Internal.Core.Monad Development.Shake.Internal.Core.Pool Development.Shake.Internal.Core.Rules Development.Shake.Internal.Core.Run Development.Shake.Internal.Core.Storage Development.Shake.Internal.Core.Types Development.Shake.Internal.Demo Development.Shake.Internal.Derived Development.Shake.Internal.Errors Development.Shake.Internal.FileInfo Development.Shake.Internal.FileName Development.Shake.Internal.FilePattern Development.Shake.Internal.Options Development.Shake.Internal.Paths Development.Shake.Internal.Profile Development.Shake.Internal.Progress Development.Shake.Internal.Resource Development.Shake.Internal.Rules.Default 
Development.Shake.Internal.Rules.Directory Development.Shake.Internal.Rules.File Development.Shake.Internal.Rules.Files Development.Shake.Internal.Rules.Oracle Development.Shake.Internal.Rules.OrderOnly Development.Shake.Internal.Rules.Rerun Development.Shake.Internal.Value General.Bilist General.Binary General.Chunks General.Cleanup General.Fence General.EscCodes General.Extra General.FileLock General.GetOpt General.Ids General.Intern General.ListBuilder General.Makefile General.Pool General.Process General.Template General.Thread General.Timing General.TypeMap General.Wait Paths_shake executable shake default-language: Haskell2010 hs-source-dirs: src ghc-options: -main-is Run.main -rtsopts if flag(threaded) ghc-options: -threaded "-with-rtsopts=-I0 -qg" main-is: Run.hs build-depends: base == 4.*, binary, bytestring, deepseq >= 1.1, directory, extra >= 1.6.19, filepath, filepattern, hashable >= 1.1.2.3, heaps >= 0.3.6.1, js-dgtable, js-flot, js-jquery, primitive, process >= 1.1, random, time, transformers >= 0.2, unordered-containers >= 0.2.7, utf8-string >= 0.3 if flag(embed-files) cpp-options: -DFILE_EMBED build-depends: file-embed >= 0.0.11, template-haskell if flag(portable) cpp-options: -DPORTABLE else if !os(windows) build-depends: unix >= 2.5.1 if !os(windows) build-depends: unix if flag(cloud) cpp-options: -DNETWORK build-depends: network, network-uri if impl(ghc < 8.0) build-depends: semigroups >= 0.18 other-modules: Development.Ninja.All Development.Ninja.Env Development.Ninja.Lexer Development.Ninja.Parse Development.Ninja.Type Development.Shake Development.Shake.Classes Development.Shake.Command Development.Shake.Database Development.Shake.FilePath Development.Shake.Internal.Args Development.Shake.Internal.CmdOption Development.Shake.Internal.CompactUI Development.Shake.Internal.Core.Action Development.Shake.Internal.Core.Build Development.Shake.Internal.Core.Database Development.Shake.Internal.History.Shared Development.Shake.Internal.History.Symlink 
Development.Shake.Internal.History.Bloom Development.Shake.Internal.History.Cloud Development.Shake.Internal.History.Network Development.Shake.Internal.History.Server Development.Shake.Internal.History.Serialise Development.Shake.Internal.History.Types Development.Shake.Internal.Core.Monad Development.Shake.Internal.Core.Pool Development.Shake.Internal.Core.Rules Development.Shake.Internal.Core.Run Development.Shake.Internal.Core.Storage Development.Shake.Internal.Core.Types Development.Shake.Internal.Demo Development.Shake.Internal.Derived Development.Shake.Internal.Errors Development.Shake.Internal.FileInfo Development.Shake.Internal.FileName Development.Shake.Internal.FilePattern Development.Shake.Internal.Options Development.Shake.Internal.Paths Development.Shake.Internal.Profile Development.Shake.Internal.Progress Development.Shake.Internal.Resource Development.Shake.Internal.Rules.Default Development.Shake.Internal.Rules.Directory Development.Shake.Internal.Rules.File Development.Shake.Internal.Rules.Files Development.Shake.Internal.Rules.Oracle Development.Shake.Internal.Rules.OrderOnly Development.Shake.Internal.Rules.Rerun Development.Shake.Internal.Value General.Bilist General.Binary General.Chunks General.Cleanup General.Fence General.EscCodes General.Extra General.FileLock General.GetOpt General.Ids General.Intern General.ListBuilder General.Makefile General.Pool General.Process General.Template General.Thread General.Timing General.TypeMap General.Wait Paths_shake test-suite shake-test default-language: Haskell2010 type: exitcode-stdio-1.0 main-is: Test.hs hs-source-dirs: src ghc-options: -main-is Test.main -rtsopts -with-rtsopts=-K1K if flag(threaded) ghc-options: -threaded build-depends: base == 4.*, binary, bytestring, deepseq >= 1.1, directory, extra >= 1.6.19, filepath, filepattern, hashable >= 1.1.2.3, heaps >= 0.3.6.1, js-dgtable, js-flot, js-jquery, primitive, process >= 1.1, QuickCheck >= 2.0, random, time, transformers >= 0.2, 
unordered-containers >= 0.2.7, utf8-string >= 0.3 if flag(embed-files) cpp-options: -DFILE_EMBED build-depends: file-embed >= 0.0.11, template-haskell if flag(portable) cpp-options: -DPORTABLE else if !os(windows) build-depends: unix >= 2.5.1 if !os(windows) build-depends: unix if flag(cloud) cpp-options: -DNETWORK build-depends: network, network-uri if impl(ghc < 8.0) build-depends: semigroups >= 0.18 other-modules: Development.Ninja.All Development.Ninja.Env Development.Ninja.Lexer Development.Ninja.Parse Development.Ninja.Type Development.Shake Development.Shake.Classes Development.Shake.Command Development.Shake.Config Development.Shake.Database Development.Shake.FilePath Development.Shake.Forward Development.Shake.Internal.Args Development.Shake.Internal.CmdOption Development.Shake.Internal.CompactUI Development.Shake.Internal.Core.Action Development.Shake.Internal.Core.Build Development.Shake.Internal.Core.Database Development.Shake.Internal.History.Shared Development.Shake.Internal.History.Symlink Development.Shake.Internal.History.Bloom Development.Shake.Internal.History.Cloud Development.Shake.Internal.History.Network Development.Shake.Internal.History.Server Development.Shake.Internal.History.Serialise Development.Shake.Internal.History.Types Development.Shake.Internal.Core.Monad Development.Shake.Internal.Core.Pool Development.Shake.Internal.Core.Rules Development.Shake.Internal.Core.Run Development.Shake.Internal.Core.Storage Development.Shake.Internal.Core.Types Development.Shake.Internal.Demo Development.Shake.Internal.Derived Development.Shake.Internal.Errors Development.Shake.Internal.FileInfo Development.Shake.Internal.FileName Development.Shake.Internal.FilePattern Development.Shake.Internal.Options Development.Shake.Internal.Paths Development.Shake.Internal.Profile Development.Shake.Internal.Progress Development.Shake.Internal.Resource Development.Shake.Internal.Rules.Default Development.Shake.Internal.Rules.Directory 
Development.Shake.Internal.Rules.File Development.Shake.Internal.Rules.Files Development.Shake.Internal.Rules.Oracle Development.Shake.Internal.Rules.OrderOnly Development.Shake.Internal.Rules.Rerun Development.Shake.Internal.Value Development.Shake.Rule Development.Shake.Util General.Bilist General.Binary General.Chunks General.Cleanup General.Fence General.EscCodes General.Extra General.FileLock General.GetOpt General.Ids General.Intern General.ListBuilder General.Makefile General.Pool General.Process General.Template General.Thread General.Timing General.TypeMap General.Wait Paths_shake Run Test.Basic Test.Batch Test.Benchmark Test.Builtin Test.BuiltinOverride Test.C Test.Cache Test.Cleanup Test.CloseFileHandles Test.Command Test.Config Test.Database Test.Digest Test.Directory Test.Docs Test.Errors Test.Existence Test.FileLock Test.FilePath Test.FilePattern Test.Files Test.Forward Test.History Test.Journal Test.Lint Test.Live Test.Manual Test.Match Test.Monad Test.Ninja Test.Oracle Test.OrderOnly Test.Parallel Test.Pool Test.Progress Test.Random Test.Rebuild Test.Reschedule Test.Resources Test.Self Test.SelfMake Test.Tar Test.Targets Test.Thread Test.Tup Test.Type Test.Unicode Test.Util Test.Verbosity Test.Version shake-0.19.8/src/Development/Ninja/0000755000000000000000000000000007346545000015205 5ustar0000000000000000shake-0.19.8/src/Development/Ninja/All.hs0000644000000000000000000003275707346545000016267 0ustar0000000000000000{-# LANGUAGE RecordWildCards, ViewPatterns, ScopedTypeVariables, TupleSections #-} module Development.Ninja.All(runNinja) where import Development.Ninja.Env import Development.Ninja.Type import Development.Ninja.Parse import Development.Shake hiding (addEnv) import Development.Shake.Classes import qualified Data.ByteString as BS8 import qualified Data.ByteString.Char8 as BS import System.Directory import qualified Data.HashMap.Strict as Map import qualified Data.HashSet as Set import Data.Tuple.Extra import Control.Exception.Extra 
import Control.Monad import Data.Maybe import Data.Char import Data.List.Extra import System.Info.Extra -- Internal imports import General.Extra(removeFile_, headErr) import General.Timing(addTiming) import General.Makefile(parseMakefile) import Development.Shake.Internal.FileName(filepathNormalise, fileNameFromString) import Development.Shake.Internal.FileInfo(getFileInfo) import Development.Shake.Internal.Errors(throwM, errorStructured) import Development.Shake.Internal.Rules.File(needBS, neededBS) import Development.Shake.Internal.Rules.OrderOnly(orderOnlyBS) -- | Given the Ninja source file, a list of file arguments, a tool name. -- Return a bool if you should restart and the rules. runNinja :: IO () -> FilePath -> [String] -> Maybe String -> IO (Maybe (Rules ())) runNinja _ file args (Just "compdb") = do dir <- getCurrentDirectory Ninja{..} <- parse file =<< newEnv rules<- pure $ Map.fromList [r | r <- rules, BS.unpack (fst r) `elem` args] -- the build items are generated in reverse order, hence the reverse let xs = [(a,b,file,rule) | (a,b@Build{..}) <- reverse $ multiples ++ map (first pure) singles , Just rule <- [Map.lookup ruleName rules], file:_ <- [depsNormal]] xs <- forM xs $ \(out,Build{..},file,Rule{..}) -> do -- the order of adding new environment variables matters env <- scopeEnv env addEnv env (BS.pack "out") (BS.unwords $ map quote out) addEnv env (BS.pack "in") (BS.unwords $ map quote depsNormal) addEnv env (BS.pack "in_newline") (BS.unlines depsNormal) forM_ buildBind $ \(a,b) -> addEnv env a b addBinds env ruleBind commandline <- fmap BS.unpack $ askVar env $ BS.pack "command" pure $ CompDb dir commandline $ BS.unpack file putStr $ printCompDb xs pure Nothing runNinja _ _ _ (Just x) = errorIO $ "Unknown tool argument, expected 'compdb', got " ++ x runNinja restart file args Nothing = do addTiming "Ninja parse" ninja@Ninja{..} <- parse file =<< newEnv pure $ Just $ do phonys<- pure $ Map.fromList phonys needDeps<- pure $ needDeps ninja phonys -- 
partial application singles<- pure $ Map.fromList $ map (first filepathNormalise) singles multiples<- pure $ Map.fromList [(x,(xs,b)) | (xs,b) <- map (first $ map filepathNormalise) multiples, x <- xs] rules<- pure $ Map.fromList rules pools <- fmap Map.fromList $ forM ((BS.pack "console",1):pools) $ \(name,depth) -> (name,) <$> newResource (BS.unpack name) depth action $ do -- build the .ninja files, if they change, restart the build before <- liftIO $ mapM (getFileInfo False . fileNameFromString) sources need sources after <- liftIO $ mapM (getFileInfo False . fileNameFromString) sources if before /= after then runAfter restart else needBS $ concatMap (resolvePhony phonys) $ if not $ null args then map BS.pack args else if not $ null defaults then defaults else Map.keys singles ++ Map.keys multiples (\x -> map BS.unpack . fst <$> Map.lookup (BS.pack x) multiples) &?> \out -> let out2 = map BS.pack out in build needDeps phonys rules pools out2 $ snd $ multiples Map.! headErr out2 (flip Map.member singles . BS.pack) ?> \out -> let out2 = BS.pack out in build needDeps phonys rules pools [out2] $ singles Map.! out2 resolvePhony :: Map.HashMap Str [Str] -> Str -> [Str] resolvePhony mp = f $ Left 100 where f (Left 0) x = f (Right []) x f (Right xs) x | x `elem` xs = error $ "Ninja recursive phony involving " ++ BS.unpack x f a x = case Map.lookup x mp of Nothing -> [x] Just xs -> concatMap (f $ either (Left . subtract 1) (Right . 
(x:)) a) xs quote :: Str -> Str quote x | BS.any isSpace x = let q = BS.singleton '\"' in BS.concat [q,x,q] | otherwise = x build :: (Build -> [Str] -> Action ()) -> Map.HashMap Str [Str] -> Map.HashMap Str Rule -> Map.HashMap Str Resource -> [Str] -> Build -> Action () build needDeps phonys rules pools out build@Build{..} = do needBS $ concatMap (resolvePhony phonys) $ depsNormal ++ depsImplicit orderOnlyBS $ concatMap (resolvePhony phonys) depsOrderOnly case Map.lookup ruleName rules of Nothing -> liftIO $ errorIO $ "Ninja rule named " ++ BS.unpack ruleName ++ " is missing, required to build " ++ BS.unpack (BS.unwords out) Just Rule{..} -> do env <- liftIO $ scopeEnv env liftIO $ do -- the order of adding new environment variables matters addEnv env (BS.pack "out") (BS.unwords $ map quote out) addEnv env (BS.pack "in") (BS.unwords $ map quote depsNormal) addEnv env (BS.pack "in_newline") (BS.unlines depsNormal) forM_ buildBind $ \(a,b) -> addEnv env a b addBinds env ruleBind applyRspfile env $ do commandline <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "command" depfile <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "depfile" deps <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "deps" description <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "description" pool <- liftIO $ askVar env $ BS.pack "pool" let withPool act = case Map.lookup pool pools of _ | BS.null pool -> act Nothing -> liftIO $ errorIO $ "Ninja pool named " ++ BS.unpack pool ++ " not found, required to build " ++ BS.unpack (BS.unwords out) Just r -> withResource r 1 act when (description /= "") $ putInfo description let (cmdOpts, cmdProg, cmdArgs) = toCommand commandline if deps == "msvc" then do Stdout stdout <- withPool $ command cmdOpts cmdProg cmdArgs prefix <- liftIO $ fmap (fromMaybe $ BS.pack "Note: including file: ") $ askEnv env $ BS.pack "msvc_deps_prefix" needDeps build $ parseShowIncludes prefix $ BS.pack stdout else withPool $ command_ cmdOpts cmdProg cmdArgs when (depfile 
/= "") $ do when (deps /= "gcc") $ need [depfile] depsrc <- liftIO $ BS.readFile depfile needDeps build $ concatMap snd $ parseMakefile depsrc -- correct as per the Ninja spec, but breaks --skip-commands -- when (deps == "gcc") $ liftIO $ removeFile depfile needDeps :: Ninja -> Map.HashMap Str [Str] -> Build -> [Str] -> Action () needDeps Ninja{..} phonysMp = \build xs -> do -- eta reduced so 'builds' is shared opts <- getShakeOptions if isNothing $ shakeLint opts then needBS xs else do neededBS xs -- now try and statically validate needed will never fail -- first find which dependencies are generated files xs<- pure $ filter (`Map.member` builds) xs -- now try and find them as dependencies -- performance note: allDependencies generates lazily, and difference consumes lazily, -- with the property that in the common case it won't generate much of the list at all let bad = xs `difference` allDependencies build case bad of [] -> pure () xs -> throwM $ errorStructured ("Lint checking error - " ++ (if length xs == 1 then "file in deps is" else "files in deps are") ++ " generated and not a pre-dependency") [("File", Just $ BS.unpack x) | x <- xs] "" where builds :: Map.HashMap FileStr Build builds = Map.fromList $ singles ++ [(x,y) | (xs,y) <- multiples, x <- xs] -- do list difference, assuming a small initial set, most of which occurs early in the list difference :: (Eq a, Hashable a) => [a] -> [a] -> [a] difference [] _ = [] difference xs ys = f (Set.fromList xs) ys where f xs [] = Set.toList xs f xs (y:ys) | y `Set.member` xs = let xs2 = Set.delete y xs in if Set.null xs2 then [] else f xs2 ys | otherwise = f xs ys -- find all dependencies of a rule, no duplicates, with all dependencies of this rule listed first allDependencies :: Build -> [FileStr] allDependencies rule = f Set.empty [] [rule] where f _ [] [] = [] f seen [] (x:xs) = f seen (map filepathNormalise $ concatMap (resolvePhony phonysMp) $ depsNormal x ++ depsImplicit x ++ depsOrderOnly x) xs f seen (x:xs) 
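{-
The difference helper above is tuned for a small left-hand list: it builds a Set once, consumes the right-hand list lazily, and stops as soon as every element has been found. For example (behaviour derived from the definition above):

```haskell
ex1 = difference [1, 2] [3, 1, 4] == [2]  -- 1 is found, 2 never occurs
ex2 = difference []     [3, 1, 4] == []   -- empty set short-circuits
```
-}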
rest | x `Set.member` seen = f seen xs rest | otherwise = x : f (Set.insert x seen) xs (maybeToList (Map.lookup x builds) ++ rest) applyRspfile :: Env Str Str -> Action a -> Action a applyRspfile env act = do rspfile <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "rspfile" rspfile_content <- liftIO $ askVar env $ BS.pack "rspfile_content" if rspfile == "" then act else flip actionFinally (removeFile_ rspfile) $ do liftIO $ BS.writeFile rspfile rspfile_content act parseShowIncludes :: Str -> Str -> [FileStr] parseShowIncludes prefix out = [y | x <- BS.lines out, prefix `BS.isPrefixOf` x , let y = BS.dropWhile isSpace $ BS.drop (BS.length prefix) x , not $ isSystemInclude y] -- Dodgy, but ported over from the original Ninja isSystemInclude :: FileStr -> Bool isSystemInclude x = bsProgFiles `BS.isInfixOf` tx || bsVisStudio `BS.isInfixOf` tx where tx = BS8.map (\c -> if c >= 97 then c - 32 else c) x -- optimised toUpper that only cares about letters and spaces bsProgFiles = BS.pack "PROGRAM FILES" bsVisStudio = BS.pack "MICROSOFT VISUAL STUDIO" data CompDb = CompDb {cdbDirectory :: String ,cdbCommand :: String ,cdbFile :: String } deriving Show printCompDb :: [CompDb] -> String printCompDb xs = unlines $ ["["] ++ concat (zipWithFrom f 1 xs) ++ ["]"] where n = length xs f i CompDb{..} = [" {" ," \"directory\": " ++ g cdbDirectory ++ "," ," \"command\": " ++ g cdbCommand ++ "," ," \"file\": " ++ g cdbFile ," }" ++ (if i == n then "" else ",")] g = show toCommand :: String -> ([CmdOption], String, [String]) toCommand s -- On POSIX, Ninja does a /bin/sh -c, and so does Haskell in Shell mode (easy). | not isWindows = ([Shell], s, []) -- On Windows, Ninja passes the string directly to CreateProcess, -- but Haskell applies some escaping first. 
-- We try and get back as close to the original as we can, but it's very hacky | length s < 8000 = -- Using the "cmd" program adds overhead (I measure 7ms), and a limit of 8191 characters, -- but is the most robust, requiring no additional escaping. ([Shell], s, []) | (cmd,s) <- word1 s, upper cmd `elem` ["CMD","CMD.EXE"], ("/c",s) <- word1 s = -- Given "cmd.exe /c " we translate to Shell, which adds cmd.exe -- (looked up on the current path) and /c to the front. CMake uses this rule a lot. -- Adding quotes around pieces are /c goes very wrong. ([Shell], s, []) | otherwise = -- It's a long command line which doesn't call "cmd /c". We reverse the escaping -- Haskell applies, but each argument will still gain quotes around it. let xs = splitArgs s in ([], headDef "" xs, drop1 xs) data State = Gap -- ^ Current in the gap between words | Word -- ^ Currently inside a space-separated argument | Quot -- ^ Currently inside a quote-surrounded argument -- | The process package contains a translate function, reproduced below. The aim is that after command line -- parsing we should get out mostly the same answer. 
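{-
The splitting below undoes that escaping; for example (behaviour derived from the definition below):

```haskell
-- Spaces separate arguments, quotes group them, and a backslash
-- escapes a following quote:
ex1 = splitArgs "a \"b c\" d" == ["a", "b c", "d"]
ex2 = splitArgs "a\\\"b"      == ["a\"b"]
```
-}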
splitArgs :: String -> [String] splitArgs = f Gap where f Gap (x:xs) | isSpace x = f Gap xs f Gap ('\"':xs) = f Quot xs f Gap [] = [] f Gap xs = f Word xs f Word (x:xs) | isSpace x = [] : f Gap xs f Quot ('\"':xs) = [] : f Gap xs f s ('\\':xs) | (length -> a, b) <- span (== '\\') xs = case b of '\"':xs | even a -> add (replicate (a `div` 2) '\\' ++ "\"") $ f s xs | otherwise -> add (replicate ((a+1) `div` 2) '\\') $ f s ('\"':xs) xs -> add (replicate (a+1) '\\') $ f s xs f s (x:xs) = add [x] $ f s xs f _ [] = [[]] add a (b:c) = (a++b):c add a [] = [a] {- translate (cmd,args) = unwords $ f cmd : map f args where f x = '"' : snd (foldr escape (True,"\"") xs) escape '"' (_, str) = (True, '\\' : '"' : str) escape '\\' (True, str) = (True, '\\' : '\\' : str) escape c (_, str) = (False, c : str) -} shake-0.19.8/src/Development/Ninja/Env.hs0000644000000000000000000000202107346545000016264 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} -- | A Ninja style environment, equivalent to a non-empty list of mutable hash tables. 
module Development.Ninja.Env( Env, newEnv, scopeEnv, addEnv, askEnv, fromEnv ) where import qualified Data.HashMap.Strict as Map import Data.Hashable import Data.IORef data Env k v = Env (IORef (Map.HashMap k v)) (Maybe (Env k v)) instance Show (Env k v) where show _ = "Env" newEnv :: IO (Env k v) newEnv = do ref <- newIORef Map.empty; pure $ Env ref Nothing scopeEnv :: Env k v -> IO (Env k v) scopeEnv e = do ref <- newIORef Map.empty; pure $ Env ref $ Just e addEnv :: (Eq k, Hashable k) => Env k v -> k -> v -> IO () addEnv (Env ref _) k v = modifyIORef ref $ Map.insert k v askEnv :: (Eq k, Hashable k) => Env k v -> k -> IO (Maybe v) askEnv (Env ref e) k = do mp <- readIORef ref case Map.lookup k mp of Just v -> pure $ Just v Nothing | Just e <- e -> askEnv e k _ -> pure Nothing fromEnv :: Env k v -> IO (Map.HashMap k v) fromEnv (Env ref _) = readIORef ref shake-0.19.8/src/Development/Ninja/Lexer.hs0000644000000000000000000001770107346545000016626 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} -- {-# OPTIONS_GHC -O2 #-} -- fails with GHC 7.10 -- {-# OPTIONS_GHC -ddump-simpl #-} -- | Lexing is a slow point, the code below is optimised module Development.Ninja.Lexer(Lexeme(..), lexerFile) where import Data.Tuple.Extra import Data.Char import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Unsafe as BS import Development.Ninja.Type import qualified Data.ByteString.Internal as Internal import System.IO.Unsafe import Data.Word import Foreign.Ptr import Foreign.Storable import GHC.Exts --------------------------------------------------------------------- -- LIBRARY BITS newtype Str0 = Str0 Str -- null terminated type S = Ptr Word8 char :: S -> Char char x = Internal.w2c $ unsafePerformIO $ peek x next :: S -> S next x = x `plusPtr` 1 {-# INLINE dropWhile0 #-} dropWhile0 :: (Char -> Bool) -> Str0 -> Str0 dropWhile0 f x = snd $ span0 f x {-# INLINE span0 #-} span0 :: (Char -> Bool) -> Str0 -> (Str, Str0) span0 f = break0 (not . 
f) {-# INLINE break0 #-} break0 :: (Char -> Bool) -> Str0 -> (Str, Str0) break0 f (Str0 bs) = (BS.unsafeTake i bs, Str0 $ BS.unsafeDrop i bs) where i = unsafePerformIO $ BS.unsafeUseAsCString bs $ \ptr -> do let start = castPtr ptr :: S let end = go start pure $! Ptr end `minusPtr` start go s@(Ptr a) | c == '\0' || f c = a | otherwise = go (next s) where c = char s {-# INLINE break00 #-} -- The predicate must return true for '\0' break00 :: (Char -> Bool) -> Str0 -> (Str, Str0) break00 f (Str0 bs) = (BS.unsafeTake i bs, Str0 $ BS.unsafeDrop i bs) where i = unsafePerformIO $ BS.unsafeUseAsCString bs $ \ptr -> do let start = castPtr ptr :: S let end = go start pure $! Ptr end `minusPtr` start go s@(Ptr a) | f c = a | otherwise = go (next s) where c = char s head0 :: Str0 -> Char head0 (Str0 x) = Internal.w2c $ BS.unsafeHead x tail0 :: Str0 -> Str0 tail0 (Str0 x) = Str0 $ BS.unsafeTail x list0 :: Str0 -> (Char, Str0) list0 x = (head0 x, tail0 x) take0 :: Int -> Str0 -> Str take0 i (Str0 x) = BS.takeWhile (/= '\0') $ BS.take i x --------------------------------------------------------------------- -- ACTUAL LEXER -- Lex each line separately, rather than each lexeme data Lexeme = LexBind Str Expr -- [indent]foo = bar | LexBuild [Expr] Str [Expr] -- build foo: bar | baz || qux (| and || are represented as Expr) | LexInclude Expr -- include file | LexSubninja Expr -- include file | LexRule Str -- rule name | LexPool Str -- pool name | LexDefault [Expr] -- default foo bar | LexDefine Str Expr -- foo = bar deriving Show isVar, isVarDot :: Char -> Bool isVar x = x == '-' || x == '_' || isAsciiLower x || isAsciiUpper x || isDigit x isVarDot x = x == '.' 
|| isVar x endsDollar :: Str -> Bool endsDollar = BS.isSuffixOf (BS.singleton '$') dropN :: Str0 -> Str0 dropN x = if head0 x == '\n' then tail0 x else x dropSpace :: Str0 -> Str0 dropSpace = dropWhile0 (== ' ') lexerFile :: Maybe FilePath -> IO [Lexeme] lexerFile file = lexer <$> maybe BS.getContents BS.readFile file lexer :: Str -> [Lexeme] lexer x = lexerLoop $ Str0 $ x `BS.append` BS.pack "\n\n\0" lexerLoop :: Str0 -> [Lexeme] lexerLoop c_x | (c,x) <- list0 c_x = case c of '\r' -> lexerLoop x '\n' -> lexerLoop x ' ' -> lexBind $ dropSpace x '#' -> lexerLoop $ dropWhile0 (/= '\n') x 'b' | Just x <- strip "uild " x -> lexBuild x 'r' | Just x <- strip "ule " x -> lexRule x 'd' | Just x <- strip "efault " x -> lexDefault x 'p' | Just x <- strip "ool " x -> lexPool x 'i' | Just x <- strip "nclude " x -> lexInclude x 's' | Just x <- strip "ubninja " x -> lexSubninja x '\0' -> [] _ -> lexDefine c_x where strip str (Str0 x) = if b `BS.isPrefixOf` x then Just $ dropSpace $ Str0 $ BS.drop (BS.length b) x else Nothing where b = BS.pack str lexBind :: Str0 -> [Lexeme] lexBind c_x | (c,x) <- list0 c_x = case c of '\r' -> lexerLoop x '\n' -> lexerLoop x '#' -> lexerLoop $ dropWhile0 (/= '\n') x '\0' -> [] _ -> lexxBind LexBind c_x lexBuild :: Str0 -> [Lexeme] lexBuild x | (outputs,x) <- lexxExprs True x , (rule,x) <- span0 isVarDot $ jumpCont $ dropSpace x , (deps,x) <- lexxExprs False $ dropSpace x = LexBuild outputs rule deps : lexerLoop x lexDefault :: Str0 -> [Lexeme] lexDefault x | (files,x) <- lexxExprs False x = LexDefault files : lexerLoop x lexRule, lexPool, lexInclude, lexSubninja, lexDefine :: Str0 -> [Lexeme] lexRule = lexxName LexRule lexPool = lexxName LexPool lexInclude = lexxFile LexInclude lexSubninja = lexxFile LexSubninja lexDefine = lexxBind LexDefine lexxBind :: (Str -> Expr -> Lexeme) -> Str0 -> [Lexeme] lexxBind ctor x | (var,x) <- span0 isVarDot x , ('=',x) <- list0 $ jumpCont $ dropSpace x , (exp,x) <- lexxExpr False False $ jumpCont $ dropSpace x = 
ctor var exp : lexerLoop x lexxBind _ x = error $ "Ninja parse failed when parsing binding, " ++ show (take0 100 x) lexxFile :: (Expr -> Lexeme) -> Str0 -> [Lexeme] lexxFile ctor x | (exp,rest) <- lexxExpr False False $ dropSpace x = ctor exp : lexerLoop rest lexxName :: (Str -> Lexeme) -> Str0 -> [Lexeme] lexxName ctor x | (name,rest) <- splitLineCont x = ctor name : lexerLoop rest lexxExprs :: Bool -> Str0 -> ([Expr], Str0) lexxExprs stopColon x = case lexxExpr stopColon True x of (a,c_x) | c <- head0 c_x, x <- tail0 c_x -> case c of ' ' -> add a $ lexxExprs stopColon $ dropSpace x ':' | stopColon -> new a x _ | stopColon -> error "Ninja parsing, expected a colon" '\r' -> new a $ dropN x '\n' -> new a x '\0' -> new a c_x _ -> error "Ninja parsing, unexpected expression" where new a x = add a ([], x) add (Exprs []) x = x add a (as,x) = (a:as,x) {-# NOINLINE lexxExpr #-} lexxExpr :: Bool -> Bool -> Str0 -> (Expr, Str0) -- snd will start with one of " :\n\r" or be empty lexxExpr stopColon stopSpace = first exprs . 
f where exprs [x] = x exprs xs = Exprs xs special = case (stopColon, stopSpace) of (True , True ) -> \x -> x <= ':' && (x == ':' || x == ' ' || x == '$' || x == '\r' || x == '\n' || x == '\0') (True , False) -> \x -> x <= ':' && (x == ':' || x == '$' || x == '\r' || x == '\n' || x == '\0') (False, True ) -> \x -> x <= '$' && ( x == ' ' || x == '$' || x == '\r' || x == '\n' || x == '\0') (False, False) -> \x -> x <= '$' && ( x == '$' || x == '\r' || x == '\n' || x == '\0') f x = case break00 special x of (a,x) -> if BS.null a then g x else Lit a $: g x x $: (xs,y) = (x:xs,y) g x | head0 x /= '$' = ([], x) g x | c_x <- tail0 x, (c,x) <- list0 c_x = case c of '$' -> Lit (BS.singleton '$') $: f x ' ' -> Lit (BS.singleton ' ') $: f x ':' -> Lit (BS.singleton ':') $: f x '\n' -> f $ dropSpace x '\r' -> f $ dropSpace $ dropN x '{' | (name,x) <- span0 isVarDot x, not $ BS.null name, ('}',x) <- list0 x -> Var name $: f x _ | (name,x) <- span0 isVar c_x, not $ BS.null name -> Var name $: f x _ -> error "Ninja parsing, unexpected $ followed by unexpected stuff" jumpCont :: Str0 -> Str0 jumpCont o | '$' <- head0 o , let x = tail0 o = case head0 x of '\n' -> dropSpace $ tail0 x '\r' -> dropSpace $ dropN $ tail0 x _ -> o | otherwise = o splitLineCont :: Str0 -> (Str, Str0) splitLineCont x = first BS.concat $ f x where f x = if not $ endsDollar a then ([a], b) else let (c,d) = f $ dropSpace b in (BS.init a : c, d) where (a,b) = splitLineCR x splitLineCR :: Str0 -> (Str, Str0) splitLineCR x = if BS.singleton '\r' `BS.isSuffixOf` a then (BS.init a, dropN b) else (a, dropN b) where (a,b) = break0 (== '\n') x shake-0.19.8/src/Development/Ninja/Parse.hs0000644000000000000000000000545307346545000016622 0ustar0000000000000000{-# LANGUAGE RecordWildCards, TupleSections #-} module Development.Ninja.Parse(parse) where import qualified Data.ByteString.Char8 as BS import Development.Ninja.Env import Development.Ninja.Type import Development.Ninja.Lexer import Control.Monad import General.Extra
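Before the parser itself, it may help to see what the `splitDeps` classification defined later in this module computes: Ninja build statements separate normal dependencies from implicit ones (after `|`) and order-only ones (after `||`). A self-contained sketch over plain `String`s (the real code works on `Str`, i.e. `ByteString`; the primed name is mine):

```haskell
-- Standalone String version of splitDeps from this module: everything before
-- "|" is a normal dependency, between "|" and "||" implicit, after "||"
-- order-only. The markers themselves arrive as separate words.
splitDeps' :: [String] -> ([String], [String], [String])
splitDeps' (x:xs)
    | x == "|"  = ([], a ++ b, c)
    | x == "||" = ([], b, a ++ c)
    | otherwise = (x:a, b, c)
    where (a, b, c) = splitDeps' xs
splitDeps' [] = ([], [], [])

main :: IO ()
main = print $ splitDeps' (words "foo.o bar.o | gen.h || outdir")
-- → (["foo.o","bar.o"],["gen.h"],["outdir"])
```

The same shape appears in `applyStmt` below, where the three lists become `depsNormal`, `depsImplicit` and `depsOrderOnly` of a `Build`.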
parse :: FilePath -> Env Str Str -> IO Ninja parse file env = parseFile file env newNinja parseFile :: FilePath -> Env Str Str -> Ninja -> IO Ninja parseFile file env ninja = do lexes <- lexerFile $ if file == "-" then Nothing else Just file foldM (applyStmt env) ninja{sources=file:sources ninja} $ withBinds lexes withBinds :: [Lexeme] -> [(Lexeme, [(Str,Expr)])] withBinds [] = [] withBinds (x:xs) = (x,a) : withBinds b where (a,b) = f xs f (LexBind a b : rest) = let (as,bs) = f rest in ((a,b):as, bs) f xs = ([], xs) applyStmt :: Env Str Str -> Ninja -> (Lexeme, [(Str,Expr)]) -> IO Ninja applyStmt env ninja@Ninja{..} (key, binds) = case key of LexBuild outputs rule deps -> do outputs <- mapM (askExpr env) outputs deps <- mapM (askExpr env) deps binds <- mapM (\(a,b) -> (a,) <$> askExpr env b) binds let (normal,implicit,orderOnly) = splitDeps deps let build = Build rule env normal implicit orderOnly binds pure $ if rule == BS.pack "phony" then ninja{phonys = [(x, normal ++ implicit ++ orderOnly) | x <- outputs] ++ phonys} else if length outputs == 1 then ninja{singles = (headErr outputs, build) : singles} else ninja{multiples = (outputs, build) : multiples} LexRule name -> pure ninja{rules = (name, Rule binds) : rules} LexDefault xs -> do xs <- mapM (askExpr env) xs pure ninja{defaults = xs ++ defaults} LexPool name -> do depth <- getDepth env binds pure ninja{pools = (name, depth) : pools} LexInclude expr -> do file <- askExpr env expr parseFile (BS.unpack file) env ninja LexSubninja expr -> do file <- askExpr env expr e <- scopeEnv env parseFile (BS.unpack file) e ninja LexDefine a b -> do addBind env a b pure ninja LexBind a _ -> error $ "Ninja parsing, unexpected binding defining " ++ BS.unpack a splitDeps :: [Str] -> ([Str], [Str], [Str]) splitDeps (x:xs) | x == BS.pack "|" = ([],a++b,c) | x == BS.pack "||" = ([],b,a++c) | otherwise = (x:a,b,c) where (a,b,c) = splitDeps xs splitDeps [] = ([], [], []) getDepth :: Env Str Str -> [(Str, Expr)] -> IO Int getDepth 
env xs = case lookup (BS.pack "depth") xs of Nothing -> pure 1 Just x -> do x <- askExpr env x case BS.readInt x of Just (i, n) | BS.null n -> pure i _ -> error $ "Ninja parsing, could not parse depth field in pool, got: " ++ BS.unpack x shake-0.19.8/src/Development/Ninja/Type.hs0000644000000000000000000000342407346545000016465 0ustar0000000000000000{-# LANGUAGE TupleSections #-} -- | The IO in this module is only to evaluate an environment variable, -- the 'Env' itself it passed around purely. module Development.Ninja.Type( Str, FileStr, Expr(..), Env, newEnv, askVar, askExpr, addEnv, addBind, addBinds, Ninja(..), newNinja, Build(..), Rule(..), ) where import Development.Ninja.Env import qualified Data.ByteString.Char8 as BS import Data.Maybe type Str = BS.ByteString type FileStr = Str --------------------------------------------------------------------- -- EXPRESSIONS AND BINDINGS data Expr = Exprs [Expr] | Lit Str | Var Str deriving (Show,Eq) askExpr :: Env Str Str -> Expr -> IO Str askExpr e = f where f (Exprs xs) = BS.concat <$> mapM f xs f (Lit x) = pure x f (Var x) = askVar e x askVar :: Env Str Str -> Str -> IO Str askVar e x = fromMaybe BS.empty <$> askEnv e x addBind :: Env Str Str -> Str -> Expr -> IO () addBind e k v = addEnv e k =<< askExpr e v addBinds :: Env Str Str -> [(Str, Expr)] -> IO () addBinds e bs = do bs <- mapM (\(a,b) -> (a,) <$> askExpr e b) bs mapM_ (uncurry $ addEnv e) bs --------------------------------------------------------------------- -- STRUCTURE data Ninja = Ninja {sources :: [FilePath] ,rules :: [(Str,Rule)] ,singles :: [(FileStr,Build)] ,multiples :: [([FileStr], Build)] ,phonys :: [(Str, [FileStr])] ,defaults :: [FileStr] ,pools :: [(Str, Int)] } deriving Show newNinja :: Ninja newNinja = Ninja [] [] [] [] [] [] [] data Build = Build {ruleName :: Str ,env :: Env Str Str ,depsNormal :: [FileStr] ,depsImplicit :: [FileStr] ,depsOrderOnly :: [FileStr] ,buildBind :: [(Str,Str)] } deriving Show newtype Rule = Rule {ruleBind :: 
[(Str,Expr)] } deriving Show shake-0.19.8/src/Development/0000755000000000000000000000000007346545000014146 5ustar0000000000000000shake-0.19.8/src/Development/Shake.hs0000644000000000000000000002663707346545000015553 0ustar0000000000000000{-# LANGUAGE TypeFamilies, TypeOperators, ConstraintKinds, PatternSynonyms #-} -- | This module is used for defining Shake build systems. As a simple example of a Shake build system, -- let us build the file @result.tar@ from the files listed by @result.txt@: -- -- @ -- import "Development.Shake" -- import "Development.Shake.FilePath" -- -- main = 'shakeArgs' 'shakeOptions' $ do -- 'want' [\"result.tar\"] -- \"*.tar\" '%>' \\out -> do -- contents \<- 'readFileLines' $ out 'Development.Shake.FilePath.-<.>' \"txt\" -- 'need' contents -- 'cmd' \"tar -cf\" [out] contents -- @ -- -- We start by importing the modules defining both Shake and routines for manipulating 'FilePath' values. -- We define @main@ to call 'shake' with the default 'shakeOptions'. As the second argument to -- 'shake', we provide a set of rules. There are two common forms of rules, 'want' to specify target files, -- and '%>' to define a rule which builds a 'FilePattern'. We use 'want' to require that after the build -- completes the file @result.tar@ should be ready. -- -- The @*.tar@ rule describes how to build files with the extension @.tar@, including @result.tar@. -- We 'readFileLines' on @result.txt@, after changing the @.tar@ extension to @.txt@. We read each line -- into the variable @contents@ -- being a list of the files that should go into @result.tar@. Next, we -- depend ('need') all the files in @contents@. If any of these files change, the rule will be repeated. -- Finally we call the @tar@ program. If either @result.txt@ changes, or any of the files listed by @result.txt@ -- change, then @result.tar@ will be rebuilt. -- -- To find out more: -- -- * The user manual contains a longer example and background information on how to use Shake -- . 
-- -- * The home page has links to additional information , including -- a mailing list. -- -- * The theory behind Shake is described in an ICFP 2012 paper, -- . -- The forms a short overview of Shake. module Development.Shake( -- * Writing a build system -- $writing -- * GHC build flags -- $flags -- * Other Shake modules -- $modules -- * Core shake, shakeOptions, Rules, action, withoutActions, alternatives, priority, versioned, Action, traced, liftIO, actionOnException, actionFinally, actionBracket, actionCatch, actionRetry, runAfter, ShakeException(..), -- * Configuration ShakeOptions(..), Rebuild(..), Lint(..), Change(..), getShakeOptions, getShakeOptionsRules, getHashedShakeVersion, getShakeExtra, getShakeExtraRules, addShakeExtra, -- ** Command line shakeArgs, shakeArgsWith, shakeArgsOptionsWith, shakeOptDescrs, addHelpSuffix, -- ** Targets getTargets, addTarget, withTargetDocs, withoutTargets, -- ** Progress reporting Progress(..), progressSimple, progressDisplay, progressTitlebar, progressProgram, getProgress, -- ** Verbosity Verbosity(..), getVerbosity, putVerbose, putInfo, putWarn, putError, withVerbosity, quietly, -- * Running commands command, command_, cmd, cmd_, unit, Stdout(..), StdoutTrim(..), Stderr(..), Stdouterr(..), Exit(..), Process(..), CmdTime(..), CmdLine(..), FSATrace(..), CmdResult, CmdString, CmdOption(..), addPath, addEnv, -- * Explicit parallelism parallel, forP, par, -- * Utility functions copyFile', copyFileChanged, readFile', readFileLines, writeFile', writeFileLines, writeFileChanged, removeFiles, removeFilesAfter, withTempFile, withTempDir, withTempFileWithin, withTempDirWithin, -- * File rules need, want, (%>), (|%>), (?>), phony, (~>), phonys, (&%>), (&?>), orderOnly, orderOnlyAction, FilePattern, (?==), (), filePattern, needed, trackRead, trackWrite, trackAllow, -- * Directory rules doesFileExist, doesDirectoryExist, getDirectoryContents, getDirectoryFiles, getDirectoryDirs, getDirectoryFilesIO, -- * Environment rules getEnv, 
getEnvWithDefault, getEnvError, -- * Oracle rules ShakeValue, RuleResult, addOracle, addOracleCache, addOracleHash, askOracle, askOracles, -- * Special rules alwaysRerun, -- * Resources Resource, newResource, newResourceIO, withResource, withResources, newThrottle, newThrottleIO, unsafeExtraThread, -- * Cache newCache, newCacheIO, historyDisable, produces, -- * Batching needHasChanged, resultHasChanged, batch, reschedule, -- * Deprecated askOracleWith, deprioritize, pattern Quiet, pattern Normal, pattern Loud, pattern Chatty, putLoud, putNormal, putQuiet ) where -- I would love to use module export in the above export list, but alas Haddock -- then shows all the things that are hidden in the docs, which is terrible. import Control.Monad.IO.Class import Development.Shake.Internal.Value import Development.Shake.Internal.Options import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Action import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Resource import Development.Shake.Internal.Derived import Development.Shake.Internal.Errors import Development.Shake.Internal.Progress import Development.Shake.Internal.Args import Development.Shake.Command import Development.Shake.Internal.FilePattern import Development.Shake.Internal.Rules.Directory import Development.Shake.Internal.Rules.File import Development.Shake.Internal.Rules.Files import Development.Shake.Internal.Rules.Oracle import Development.Shake.Internal.Rules.OrderOnly import Development.Shake.Internal.Rules.Rerun -- $writing -- -- When writing a Shake build system, start by defining what you 'want', then write rules -- with '%>' to produce the results. Before calling 'cmd' you should ensure that any files the command -- requires are demanded with calls to 'need'. We offer the following advice to Shake users: -- -- * If @ghc --make@ or @cabal@ is capable of building your project, use that instead. 
Custom build systems are -- necessary for many complex projects, but many projects are not complex. -- -- * The 'shakeArgs' function automatically handles command line arguments. To define non-file targets use 'phony'. -- -- * Put all result files in a distinguished directory, for example @_make@. You can implement a @clean@ -- command by removing that directory, using @'removeFilesAfter' \"_make\" [\"\/\/\*\"]@. -- -- * To obtain parallel builds set 'shakeThreads' to a number greater than 1. -- -- * Lots of compilers produce @.o@ files. To avoid overlapping rules, use @.c.o@ for C compilers, -- @.hs.o@ for Haskell compilers etc. -- -- * Do not be afraid to mix Shake rules, system commands and other Haskell libraries -- use each for what -- it does best. -- -- * The more accurate the dependencies are, the better. Use additional rules like 'doesFileExist' and -- 'getDirectoryFiles' to track information other than just the contents of files. For information in the environment -- that you suspect will change regularly (perhaps @ghc@ version number), either write the information to -- a file with 'alwaysRerun' and 'writeFileChanged', or use 'addOracle'. -- $flags -- -- For large build systems the choice of GHC flags can have a significant impact. We recommend: -- -- > ghc --make MyBuildSystem -threaded -rtsopts "-with-rtsopts=-I0 -qg" -- -- * @-rtsopts@: Allow the setting of further GHC options at runtime. -- -- * @-I0@: Disable idle garbage collection, to avoid frequent unnecessary garbage collection, see -- . -- -- * You may add @-threaded@, and pass the options @-qg@ to @-with-rtsopts@ -- to disable parallel garbage collection. Parallel garbage collection in Shake -- programs typically goes slower than sequential garbage collection, while occupying many cores that -- could be used for running system commands. -- $modules -- -- The main Shake module is this one, "Development.Shake", which should be sufficient for most -- people writing build systems using Shake. 
However, Shake provides some additional modules, -- -- * "Development.Shake.Classes" provides convenience exports of the classes Shake relies on, -- in particular 'Binary', 'Hashable' and 'NFData'. Useful for deriving these types using -- @GeneralizedNewtypeDeriving@ without adding dependencies on the associated packages. -- -- * "Development.Shake.Command" provides the command line wrappers. These are reexported by -- "Development.Shake", but if you want to reuse just the command-line running functionality -- in a non-Shake program you can import just that. -- -- * "Development.Shake.Config" provides a way to write configuration files that are tracked. -- The configuration files are in the Ninja format. Useful for users of bigger systems who -- want to track the build rules not in Haskell. -- -- * "Development.Shake.Database" provides lower level primitives to drive Shake, particularly -- useful if you want to run multiple Shake runs in a row without reloading from the database. -- -- * "Development.Shake.FilePath" is an extension of "System.FilePath" with a few additional -- methods and safer extension manipulation code. -- -- * "Development.Shake.Forward" is an alternative take on build systems, where you write the -- rules as a script where steps are skipped, rather than as a set of dependencies. Only really -- works if you use @fsatrace@. -- -- * "Development.Shake.Rule" provides tools for writing your own types of Shake rules. Useful -- if you need something new, like a rule that queries a database or similar. -- -- * "Development.Shake.Util" has general utilities that are useful for build systems, e.g. -- reading @Makefile@ syntax and alternative forms of argument parsing. --------------------------------------------------------------------- -- DEPRECATED SINCE 0.16.1, NOV 2017 -- | /Deprecated:/ Replace @'askOracleWith' q a@ by @'askOracle' q@ -- since the 'RuleResult' type family now fixes the result type. 
{-# DEPRECATED askOracleWith "Use 'askOracle q' instead of 'askOracleWith q a', the result value is now unnecessary" #-} askOracleWith :: (RuleResult q ~ a, ShakeValue q, ShakeValue a) => q -> a -> Action a askOracleWith question _ = askOracle question --------------------------------------------------------------------- -- DEPRECATED SINCE 0.18.4, JUL 2019 -- | /Deprecated:/ Alias for 'reschedule'. {-# DEPRECATED deprioritize "Use 'reschedule' instead" #-} deprioritize :: Double -> Action () deprioritize = reschedule -- | /Deprecated:/ A bidirectional pattern synonym for 'Error'. pattern Quiet :: Verbosity pattern Quiet = Error -- | /Deprecated:/ A bidirectional pattern synonym for 'Info'. pattern Normal :: Verbosity pattern Normal = Info -- | /Deprecated:/ A bidirectional pattern synonym for 'Verbose'. pattern Loud :: Verbosity pattern Loud = Verbose -- | /Deprecated:/ A bidirectional pattern synonym for 'Verbose'. pattern Chatty :: Verbosity pattern Chatty = Verbose putLoud, putNormal, putQuiet :: String -> Action () -- | /Deprecated:/ Alias for 'putVerbose'. putLoud = putVerbose -- | /Deprecated:/ Alias for 'putInfo'. putNormal = putInfo -- | /Deprecated:/ Alias for 'putError'. putQuiet = putError shake-0.19.8/src/Development/Shake/0000755000000000000000000000000007346545000015201 5ustar0000000000000000shake-0.19.8/src/Development/Shake/Classes.hs0000644000000000000000000000107607346545000017136 0ustar0000000000000000 -- | This module reexports the six necessary type classes that many rule types must support through 'ShakeValue'. -- You can use this module to define new rules without depending on the @binary@, @deepseq@ and @hashable@ packages. module Development.Shake.Classes( Show(..), Typeable, Eq(..), Hashable(..), Binary(..), NFData(..) ) where -- I would probably reexport this module by default in Development.Shake, -- but Binary defines 'get', which clashes with the State monad. 
import Data.Hashable import Data.Typeable import Data.Binary import Control.DeepSeq shake-0.19.8/src/Development/Shake/Command.hs0000644000000000000000000010545307346545000017123 0ustar0000000000000000{-# LANGUAGE LambdaCase #-} {-# LANGUAGE DeriveFunctor #-} {-# LANGUAGE FlexibleInstances, TypeOperators, ScopedTypeVariables, NamedFieldPuns #-} {-# LANGUAGE GADTs, GeneralizedNewtypeDeriving, DeriveDataTypeable, RecordWildCards #-} -- | This module provides functions for calling command line programs, primarily -- 'command' and 'cmd'. As a simple example: -- -- @ -- 'command' [] \"gcc\" [\"-c\",myfile] -- @ -- -- The functions from this module are now available directly from "Development.Shake". -- You should only need to import this module if you are using the 'cmd' function in the 'IO' monad. module Development.Shake.Command( command, command_, cmd, cmd_, unit, CmdArgument(..), CmdArguments(..), IsCmdArgument(..), (:->), Stdout(..), StdoutTrim(..), Stderr(..), Stdouterr(..), Exit(..), Process(..), CmdTime(..), CmdLine(..), FSATrace(..), CmdResult, CmdString, CmdOption(..), addPath, addEnv, ) where import Data.Tuple.Extra import Control.Monad.Extra import Control.Monad.IO.Class import Control.Exception.Extra import Data.Char import Data.Either.Extra import Data.Foldable (toList) import Data.List.Extra import Data.List.NonEmpty (NonEmpty) import qualified Data.HashSet as Set import Data.Maybe import Data.Data import Data.Semigroup import System.Directory import qualified System.IO.Extra as IO import System.Environment import System.Exit import System.IO.Extra hiding (withTempFile, withTempDir) import System.Process import System.Info.Extra import System.Time.Extra import System.IO.Unsafe (unsafeInterleaveIO) import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Lazy.Char8 as LBS import qualified Data.ByteString.UTF8 as UTF8 import General.Extra import General.Process import Prelude import Development.Shake.Internal.CmdOption import 
Development.Shake.Internal.Core.Action import Development.Shake.Internal.Core.Types hiding (Result) import Development.Shake.FilePath import Development.Shake.Internal.Options import Development.Shake.Internal.Rules.File import Development.Shake.Internal.Derived --------------------------------------------------------------------- -- ACTUAL EXECUTION -- | /Deprecated:/ Use 'AddPath'. This function will be removed in a future version. -- -- Add a prefix and suffix to the @$PATH@ environment variable. For example: -- -- @ -- opt <- 'addPath' [\"\/usr\/special\"] [] -- 'cmd' opt \"userbinary --version\" -- @ -- -- Would prepend @\/usr\/special@ to the current @$PATH@, and the command would pick -- @\/usr\/special\/userbinary@, if it exists. To add other variables see 'addEnv'. addPath :: MonadIO m => [String] -> [String] -> m CmdOption addPath pre post = do args <- liftIO getEnvironment let (path,other) = partition ((== "PATH") . (if isWindows then upper else id) . fst) args pure $ Env $ [("PATH",intercalate [searchPathSeparator] $ pre ++ post) | null path] ++ [(a,intercalate [searchPathSeparator] $ pre ++ [b | b /= ""] ++ post) | (a,b) <- path] ++ other -- | /Deprecated:/ Use 'AddEnv'. This function will be removed in a future version. -- -- Add a single variable to the environment. For example: -- -- @ -- opt <- 'addEnv' [(\"CFLAGS\",\"-O2\")] -- 'cmd' opt \"gcc -c main.c\" -- @ -- -- Would add the environment variable @$CFLAGS@ with value @-O2@. If the variable @$CFLAGS@ -- was already defined it would be overwritten. If you wish to modify @$PATH@ see 'addPath'. 
addEnv :: MonadIO m => [(String, String)] -> m CmdOption addEnv extra = do args <- liftIO getEnvironment pure $ Env $ extra ++ filter (\(a,_) -> a `notElem` map fst extra) args data Str = Str String | BS BS.ByteString | LBS LBS.ByteString | Unit deriving (Eq,Show) strTrim :: Str -> Str strTrim (Str x) = Str $ trim x strTrim (BS x) = BS $ fst $ BS.spanEnd isSpace $ BS.dropWhile isSpace x strTrim (LBS x) = LBS $ trimEnd $ LBS.dropWhile isSpace x where trimEnd x = case LBS.uncons x of Just (c, x2) | isSpace c -> trimEnd x2 _ -> x strTrim Unit = Unit data Result = ResultStdout Str | ResultStderr Str | ResultStdouterr Str | ResultCode ExitCode | ResultTime Double | ResultLine String | ResultProcess PID | ResultFSATrace [FSATrace FilePath] | ResultFSATraceBS [FSATrace BS.ByteString] deriving (Eq,Show) data PID = PID0 | PID ProcessHandle instance Eq PID where _ == _ = True instance Show PID where show PID0 = "PID0"; show _ = "PID" data Params = Params {funcName :: String ,opts :: [CmdOption] ,results :: [Result] ,prog :: String ,args :: [String] } deriving Show class MonadIO m => MonadTempDir m where runWithTempDir :: (FilePath -> m a) -> m a runWithTempFile :: (FilePath -> m a) -> m a instance MonadTempDir IO where runWithTempDir = IO.withTempDir runWithTempFile = IO.withTempFile instance MonadTempDir Action where runWithTempDir = withTempDir runWithTempFile = withTempFile --------------------------------------------------------------------- -- DEAL WITH Shell removeOptionShell :: MonadTempDir m => Params -- ^ Given the parameter -> (Params -> m a) -- ^ Call with the revised params, program name and command line -> m a removeOptionShell params@Params{..} call | Shell `elem` opts = do -- put our UserCommand first, as the last one wins, and ours is lowest priority let userCmdline = unwords $ prog : args params <- pure params{opts = UserCommand userCmdline : filter (/= Shell) opts} prog <- liftIO $ if isFSATrace params then copyFSABinary prog else pure prog let realCmdline 
= unwords $ prog : args if not isWindows then call params{prog = "/bin/sh", args = ["-c",realCmdline]} else -- On Windows the Haskell behaviour isn't that clean and is very fragile, so we try and do better. runWithTempDir $ \dir -> do let file = dir "s.bat" writeFile' file realCmdline call params{prog = "cmd.exe", args = ["/d/q/c",file]} | otherwise = call params --------------------------------------------------------------------- -- DEAL WITH FSATrace isFSATrace :: Params -> Bool isFSATrace Params{..} = any isResultFSATrace results || any isFSAOptions opts -- Mac disables tracing on system binaries, so we copy them over, yurk copyFSABinary :: FilePath -> IO FilePath copyFSABinary prog | not isMac = pure prog | otherwise = do progFull <- findExecutable prog case progFull of Just x | any (`isPrefixOf` x) ["/bin/","/usr/","/sbin/"] -> do -- The file is one of the ones we can't trace, so we make a copy of it in $TMP and run that -- We deliberately don't clean up this directory, since otherwise we spend all our time copying binaries over tmpdir <- getTemporaryDirectory let fake = tmpdir "fsatrace-fakes" ++ x -- x is absolute, so must use ++ unlessM (doesFileExist fake) $ do createDirectoryRecursive $ takeDirectory fake copyFile x fake pure fake _ -> pure prog removeOptionFSATrace :: MonadTempDir m => Params -- ^ Given the parameter -> (Params -> m [Result]) -- ^ Call with the revised params, program name and command line -> m [Result] removeOptionFSATrace params@Params{..} call | not $ isFSATrace params = call params | ResultProcess PID0 `elem` results = -- This is a bad state to get into, you could technically just ignore the tracing, but that's a bit dangerous liftIO $ errorIO "Asynchronous process execution combined with FSATrace is not supported" | otherwise = runWithTempFile $ \file -> do liftIO $ writeFile file "" -- ensures even if we fail before fsatrace opens the file, we can still read it params <- liftIO $ fsaParams file params res <- call params{opts =
UserCommand (showCommandForUser2 prog args) : filter (not . isFSAOptions) opts} fsaResBS <- liftIO $ parseFSA <$> BS.readFile file let fsaRes = map (fmap UTF8.toString) fsaResBS pure $ flip map res $ \case ResultFSATrace [] -> ResultFSATrace fsaRes ResultFSATraceBS [] -> ResultFSATraceBS fsaResBS x -> x where fsaFlags = lastDef "rwmdqt" [x | FSAOptions x <- opts] fsaParams file Params{..} = do prog <- copyFSABinary prog pure params{prog = "fsatrace", args = fsaFlags : file : "--" : prog : args } isFSAOptions FSAOptions{} = True isFSAOptions _ = False isResultFSATrace ResultFSATrace{} = True isResultFSATrace ResultFSATraceBS{} = True isResultFSATrace _ = False addFSAOptions :: String -> [CmdOption] -> [CmdOption] addFSAOptions x opts | any isFSAOptions opts = map f opts where f (FSAOptions y) = FSAOptions $ nubOrd $ y ++ x f x = x addFSAOptions x opts = FSAOptions x : opts -- | The results produced by @fsatrace@. All files will be absolute paths. -- You can get the results for a 'cmd' by requesting a value of type -- @['FSATrace']@. data FSATrace a = -- | Writing to a file FSAWrite a | -- | Reading from a file FSARead a | -- | Deleting a file FSADelete a | -- | Moving, arguments destination, then source FSAMove a a | -- | Querying\/stat on a file FSAQuery a | -- | Touching a file FSATouch a deriving (Show,Eq,Ord,Data,Typeable,Functor) -- | Parse the 'FSATrace' entries, ignoring anything you don't understand. parseFSA :: BS.ByteString -> [FSATrace BS.ByteString] parseFSA = mapMaybe (f . dropR) . 
BS.lines where -- deal with CRLF on Windows dropR x = case BS.unsnoc x of Just (x, '\r') -> x _ -> x f x | Just (k, x) <- BS.uncons x , Just ('|', x) <- BS.uncons x = case k of 'w' -> Just $ FSAWrite x 'r' -> Just $ FSARead x 'd' -> Just $ FSADelete x 'm' | (xs, ys) <- BS.break (== '|') x, Just ('|',ys) <- BS.uncons ys -> Just $ FSAMove xs ys 'q' -> Just $ FSAQuery x 't' -> Just $ FSATouch x _ -> Nothing | otherwise = Nothing --------------------------------------------------------------------- -- ACTION EXPLICIT OPERATION -- | Given explicit operations, apply the Action ones, like skip/trace/track/autodep commandExplicitAction :: Partial => Params -> Action [Result] commandExplicitAction oparams = do ShakeOptions{shakeCommandOptions,shakeRunCommands,shakeLint,shakeLintInside} <- getShakeOptions params@Params{..} <- pure $ oparams{opts = shakeCommandOptions ++ opts oparams} let skipper act = if null results && not shakeRunCommands then pure [] else act let verboser act = do let cwd = listToMaybe $ reverse [x | Cwd x <- opts] putVerbose $ maybe "" (\x -> "cd " ++ x ++ "; ") cwd ++ lastDef (showCommandForUser2 prog args) [x | UserCommand x <- opts] verb <- getVerbosity -- run quietly to suppress the tracer (don't want to print twice) (if verb >= Verbose then quietly else id) act let tracer act = do -- note: use the oparams - find a good tracing before munging it for shell stuff let msg = lastDef (defaultTraced oparams) [x | Traced x <- opts] if msg == "" then liftIO act else traced msg act let async = ResultProcess PID0 `elem` results let tracker act | AutoDeps `elem` opts = if async then liftIO $ errorIO "Can't use AutoDeps and asynchronous execution" else autodeps act | shakeLint == Just LintFSATrace && not async = fsalint act | otherwise = act params autodeps act = do ResultFSATrace pxs : res <- act params{opts = addFSAOptions "rwm" opts, results = ResultFSATrace [] : results} let written = Set.fromList $ [x | FSAMove x _ <- pxs] ++ [x | FSAWrite x <- pxs] -- If
something both reads and writes to a file, it isn't eligible to be an autodeps xs <- liftIO $ filterM doesFileExist [x | FSARead x <- pxs, not $ x `Set.member` written] cwd <- liftIO getCurrentDirectory temp <- fixPaths cwd xs unsafeAllowApply $ need temp pure res fixPaths cwd xs = liftIO $ do xs<- pure $ map toStandard xs xs<- pure $ filter (\x -> any (`isPrefixOf` x) shakeLintInside) xs mapM (\x -> fromMaybe x <$> makeRelativeEx cwd x) xs fsalint act = do ResultFSATrace xs : res <- act params{opts = addFSAOptions "rwm" opts, results = ResultFSATrace [] : results} let reader (FSARead x) = Just x; reader _ = Nothing writer (FSAWrite x) = Just x; writer (FSAMove x _) = Just x; writer _ = Nothing existing f = liftIO . filterM doesFileExist . nubOrd . mapMaybe f cwd <- liftIO getCurrentDirectory trackRead =<< fixPaths cwd =<< existing reader xs trackWrite =<< fixPaths cwd =<< existing writer xs pure res skipper $ tracker $ \params -> verboser $ tracer $ commandExplicitIO params defaultTraced :: Params -> String defaultTraced Params{..} = takeBaseName $ if Shell `elem` opts then fst (word1 prog) else prog --------------------------------------------------------------------- -- IO EXPLICIT OPERATION -- | Given a very explicit set of CmdOption, translate them to a General.Process structure commandExplicitIO :: Partial => Params -> IO [Result] commandExplicitIO params = removeOptionShell params $ \params -> removeOptionFSATrace params $ \Params{..} -> do let (grabStdout, grabStderr) = both or $ unzip $ flip map results $ \case ResultStdout{} -> (True, False) ResultStderr{} -> (False, True) ResultStdouterr{} -> (True, True) _ -> (False, False) optEnv <- resolveEnv opts let optCwd = mergeCwd [x | Cwd x <- opts] let optStdin = flip mapMaybe opts $ \case Stdin x -> Just $ SrcString x StdinBS x -> Just $ SrcBytes x FileStdin x -> Just $ SrcFile x InheritStdin -> Just SrcInherit _ -> Nothing let optBinary = BinaryPipes `elem` opts let optAsync = ResultProcess PID0 `elem` 
results let optTimeout = listToMaybe $ reverse [x | Timeout x <- opts] let optWithStdout = lastDef False [x | WithStdout x <- opts] let optWithStderr = lastDef True [x | WithStderr x <- opts] let optFileStdout = [x | FileStdout x <- opts] let optFileStderr = [x | FileStderr x <- opts] let optEchoStdout = lastDef (not grabStdout && null optFileStdout) [x | EchoStdout x <- opts] let optEchoStderr = lastDef (not grabStderr && null optFileStderr) [x | EchoStderr x <- opts] let optRealCommand = showCommandForUser2 prog args let optUserCommand = lastDef optRealCommand [x | UserCommand x <- opts] let optCloseFds = CloseFileHandles `elem` opts let optProcessGroup = NoProcessGroup `notElem` opts let bufLBS f = do (a,b) <- buf $ LBS LBS.empty; pure (a, (\(LBS x) -> f x) <$> b) buf Str{} | optBinary = bufLBS (Str . LBS.unpack) buf Str{} = do x <- newBuffer; pure ([DestString x | not optAsync], Str . concat <$> readBuffer x) buf LBS{} = do x <- newBuffer; pure ([DestBytes x | not optAsync], LBS . LBS.fromChunks <$> readBuffer x) buf BS {} = bufLBS (BS . BS.concat . 
LBS.toChunks) buf Unit = pure ([], pure Unit) (dStdout, dStderr, resultBuild) :: ([[Destination]], [[Destination]], [Double -> ProcessHandle -> ExitCode -> IO Result]) <- fmap unzip3 $ forM results $ \case ResultCode _ -> pure ([], [], \_ _ ex -> pure $ ResultCode ex) ResultTime _ -> pure ([], [], \dur _ _ -> pure $ ResultTime dur) ResultLine _ -> pure ([], [], \_ _ _ -> pure $ ResultLine optUserCommand) ResultProcess _ -> pure ([], [], \_ pid _ -> pure $ ResultProcess $ PID pid) ResultStdout s -> do (a,b) <- buf s; pure (a , [], \_ _ _ -> fmap ResultStdout b) ResultStderr s -> do (a,b) <- buf s; pure ([], a , \_ _ _ -> fmap ResultStderr b) ResultStdouterr s -> do (a,b) <- buf s; pure (a , a , \_ _ _ -> fmap ResultStdouterr b) ResultFSATrace _ -> pure ([], [], \_ _ _ -> pure $ ResultFSATrace []) -- filled in elsewhere ResultFSATraceBS _ -> pure ([], [], \_ _ _ -> pure $ ResultFSATraceBS []) -- filled in elsewhere exceptionBuffer <- newBuffer po <- resolvePath ProcessOpts {poCommand = RawCommand prog args ,poCwd = optCwd, poEnv = optEnv, poTimeout = optTimeout ,poStdin = [SrcBytes LBS.empty | optBinary && not (null optStdin)] ++ optStdin ,poStdout = [DestEcho | optEchoStdout] ++ map DestFile optFileStdout ++ [DestString exceptionBuffer | optWithStdout && not optAsync] ++ concat dStdout ,poStderr = [DestEcho | optEchoStderr] ++ map DestFile optFileStderr ++ [DestString exceptionBuffer | optWithStderr && not optAsync] ++ concat dStderr ,poAsync = optAsync ,poCloseFds = optCloseFds ,poGroup = optProcessGroup } (dur,(pid,exit)) <- duration $ process po if exit == ExitSuccess || ResultCode ExitSuccess `elem` results then mapM (\f -> f dur pid exit) resultBuild else do exceptionBuffer <- readBuffer exceptionBuffer let captured = ["Stderr" | optWithStderr] ++ ["Stdout" | optWithStdout] cwd <- case optCwd of Nothing -> pure "" Just v -> do v <- canonicalizePath v `catchIO` const (pure v) pure $ "Current directory: " ++ v ++ "\n" liftIO $ errorIO $ "Development.Shake." 
++ funcName ++ ", system command failed\n" ++ "Command line: " ++ optRealCommand ++ "\n" ++ (if optRealCommand /= optUserCommand then "Original command line: " ++ optUserCommand ++ "\n" else "") ++ cwd ++ "Exit code: " ++ show (case exit of ExitFailure i -> i; _ -> 0) ++ "\n" ++ if null captured then "Stderr not captured because WithStderr False was used\n" else if null exceptionBuffer then intercalate " and " captured ++ " " ++ (if length captured == 1 then "was" else "were") ++ " empty" else intercalate " and " captured ++ ":\n" ++ unlines (dropWhile null $ lines $ concat exceptionBuffer) mergeCwd :: [FilePath] -> Maybe FilePath mergeCwd [] = Nothing mergeCwd xs = Just $ foldl1 (</>) xs -- | Apply all environment operations, to produce a new environment to use. resolveEnv :: [CmdOption] -> IO (Maybe [(String, String)]) resolveEnv opts | null env, null addEnv, null addPath, null remEnv = pure Nothing | otherwise = Just . unique . tweakPath . (++ addEnv) . filter (flip notElem remEnv . fst) <$> if null env then getEnvironment else pure (concat env) where env = [x | Env x <- opts] addEnv = [(x,y) | AddEnv x y <- opts] remEnv = [x | RemEnv x <- opts] addPath = [(x,y) | AddPath x y <- opts] newPath mid = intercalate [searchPathSeparator] $ concat (reverse $ map fst addPath) ++ [mid | mid /= ""] ++ concatMap snd addPath isPath x = (if isWindows then upper else id) x == "PATH" tweakPath xs | not $ any (isPath . fst) xs = ("PATH", newPath "") : xs | otherwise = map (\(a,b) -> (a, if isPath a then newPath b else b)) xs unique = reverse . nubOrdOn (if isWindows then upper . fst else fst) . reverse -- | If the user specifies a custom $PATH, and not Shell, then try and resolve their prog ourselves. -- Tricky, because on Windows it doesn't look in the $PATH first. resolvePath :: ProcessOpts -> IO ProcessOpts resolvePath po | Just e <- poEnv po , Just (_, path) <- find ((==) "PATH" . (if isWindows then upper else id) .
fst) e , RawCommand prog args <- poCommand po = do let progExe = if prog == prog -<.> exe then prog else prog <.> exe -- use unsafeInterleaveIO to allow laziness to skip the queries we don't use pathOld <- unsafeInterleaveIO $ fromMaybe "" <$> lookupEnv "PATH" old <- unsafeInterleaveIO $ findExecutable prog new <- unsafeInterleaveIO $ findExecutableWith (splitSearchPath path) progExe old2 <- unsafeInterleaveIO $ findExecutableWith (splitSearchPath pathOld) progExe switch <- pure $ case () of _ | path == pathOld -> False -- The state I can see hasn't changed | Nothing <- new -> False -- I have nothing to offer | Nothing <- old -> True -- I failed last time, so this must be an improvement | Just old <- old, Just new <- new, equalFilePath old new -> False -- no different | Just old <- old, Just old2 <- old2, equalFilePath old old2 -> True -- I could predict last time | otherwise -> False pure $ case new of Just new | switch -> po{poCommand = RawCommand new args} _ -> po resolvePath po = pure po -- | Given a list of directories, and a file name, return the complete path if you can find it. -- Like findExecutable, but with a custom PATH. findExecutableWith :: [FilePath] -> String -> IO (Maybe FilePath) findExecutableWith path x = flip firstJustM (map (</> x) path) $ \s -> ifM (doesFileExist s) (pure $ Just s) (pure Nothing) --------------------------------------------------------------------- -- FIXED ARGUMENT WRAPPER -- | Collect the @stdout@ of the process. -- If used, the @stdout@ will not be echoed to the terminal, unless you include 'EchoStdout'. -- The value type may be either 'String', or either lazy or strict 'ByteString'. -- -- Note that most programs end their output with a trailing newline, so calling -- @ghc --numeric-version@ will result in 'Stdout' of @\"6.8.3\\n\"@. If you want to automatically -- trim the resulting string, see 'StdoutTrim'. newtype Stdout a = Stdout {fromStdout :: a} -- | Like 'Stdout' but remove all leading and trailing whitespace.
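-- As an illustration (following the @ghc --numeric-version@ example above,
-- where plain 'Stdout' would give @\"6.8.3\\n\"@, the version number being hypothetical):
--
-- @
-- 'StdoutTrim' ver <- 'cmd' \"ghc --numeric-version\" -- ver == \"6.8.3\", no trailing newline
-- @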
newtype StdoutTrim a = StdoutTrim {fromStdoutTrim :: a} -- | Collect the @stderr@ of the process. -- If used, the @stderr@ will not be echoed to the terminal, unless you include 'EchoStderr'. -- The value type may be either 'String', or either lazy or strict 'ByteString'. newtype Stderr a = Stderr {fromStderr :: a} -- | Collect the @stdout@ and @stderr@ of the process. -- If used, the @stderr@ and @stdout@ will not be echoed to the terminal, unless you include 'EchoStdout' and 'EchoStderr'. -- The value type may be either 'String', or either lazy or strict 'ByteString'. newtype Stdouterr a = Stdouterr {fromStdouterr :: a} -- | Collect the 'ExitCode' of the process. -- If you do not collect the exit code, any 'ExitFailure' will cause an exception. newtype Exit = Exit {fromExit :: ExitCode} -- | Collect the 'ProcessHandle' of the process. -- If you do collect the process handle, the command will run asynchronously and the call to 'cmd' \/ 'command' -- will return as soon as the process is spawned. Any 'Stdout' \/ 'Stderr' captures will return empty strings. newtype Process = Process {fromProcess :: ProcessHandle} -- | Collect the time taken to execute the process. Can be used in conjunction with 'CmdLine' to -- write helper functions that print out the time of a result. -- -- @ -- timer :: ('CmdResult' r, MonadIO m) => (forall r . 'CmdResult' r => m r) -> m r -- timer act = do -- ('CmdTime' t, 'CmdLine' x, r) <- act -- liftIO $ putStrLn $ \"Command \" ++ x ++ \" took \" ++ show t ++ \" seconds\" -- pure r -- -- run :: IO () -- run = timer $ 'cmd' \"ghc --version\" -- @ newtype CmdTime = CmdTime {fromCmdTime :: Double} -- | Collect the command line used for the process. This command line will be approximate - -- suitable for user diagnostics, but not for direct execution. newtype CmdLine = CmdLine {fromCmdLine :: String} -- | The allowable 'String'-like values that can be captured.
class CmdString a where cmdString :: (Str, Str -> a) instance CmdString () where cmdString = (Unit, \Unit -> ()) instance CmdString String where cmdString = (Str "", \(Str x) -> x) instance CmdString BS.ByteString where cmdString = (BS BS.empty, \(BS x) -> x) instance CmdString LBS.ByteString where cmdString = (LBS LBS.empty, \(LBS x) -> x) class Unit a instance {-# OVERLAPPING #-} Unit b => Unit (a -> b) instance {-# OVERLAPPABLE #-} a ~ () => Unit (m a) -- | A class for specifying what results you want to collect from a process. -- Values are formed of 'Stdout', 'Stderr', 'Exit' and tuples of those. class CmdResult a where -- Return a list of results (with the right type but dummy data) -- and a function to transform a populated set of results into a value cmdResult :: ([Result], [Result] -> a) instance CmdResult Exit where cmdResult = ([ResultCode ExitSuccess], \[ResultCode x] -> Exit x) instance CmdResult ExitCode where cmdResult = ([ResultCode ExitSuccess], \[ResultCode x] -> x) instance CmdResult Process where cmdResult = ([ResultProcess PID0], \[ResultProcess (PID x)] -> Process x) instance CmdResult ProcessHandle where cmdResult = ([ResultProcess PID0], \[ResultProcess (PID x)] -> x) instance CmdResult CmdLine where cmdResult = ([ResultLine ""], \[ResultLine x] -> CmdLine x) instance CmdResult CmdTime where cmdResult = ([ResultTime 0], \[ResultTime x] -> CmdTime x) instance CmdResult [FSATrace FilePath] where cmdResult = ([ResultFSATrace []], \[ResultFSATrace x] -> x) instance CmdResult [FSATrace BS.ByteString] where cmdResult = ([ResultFSATraceBS []], \[ResultFSATraceBS x] -> x) instance CmdString a => CmdResult (Stdout a) where cmdResult = let (a,b) = cmdString in ([ResultStdout a], \[ResultStdout x] -> Stdout $ b x) instance CmdString a => CmdResult (StdoutTrim a) where cmdResult = let (a,b) = cmdString in ([ResultStdout a], \[ResultStdout x] -> StdoutTrim $ b $ strTrim x) instance CmdString a => CmdResult (Stderr a) where cmdResult = let (a,b) = 
cmdString in ([ResultStderr a], \[ResultStderr x] -> Stderr $ b x) instance CmdString a => CmdResult (Stdouterr a) where cmdResult = let (a,b) = cmdString in ([ResultStdouterr a], \[ResultStdouterr x] -> Stdouterr $ b x) instance CmdResult () where cmdResult = ([], \[] -> ()) instance (CmdResult x1, CmdResult x2) => CmdResult (x1,x2) where cmdResult = (a1++a2, \rs -> let (r1,r2) = splitAt (length a1) rs in (b1 r1, b2 r2)) where (a1,b1) = cmdResult (a2,b2) = cmdResult cmdResultWith :: forall b c. CmdResult b => (b -> c) -> ([Result], [Result] -> c) cmdResultWith f = second (f .) cmdResult instance (CmdResult x1, CmdResult x2, CmdResult x3) => CmdResult (x1,x2,x3) where cmdResult = cmdResultWith $ \(a,(b,c)) -> (a,b,c) instance (CmdResult x1, CmdResult x2, CmdResult x3, CmdResult x4) => CmdResult (x1,x2,x3,x4) where cmdResult = cmdResultWith $ \(a,(b,c,d)) -> (a,b,c,d) instance (CmdResult x1, CmdResult x2, CmdResult x3, CmdResult x4, CmdResult x5) => CmdResult (x1,x2,x3,x4,x5) where cmdResult = cmdResultWith $ \(a,(b,c,d,e)) -> (a,b,c,d,e) -- | Execute a system command. Before running 'command' make sure you 'Development.Shake.need' any files -- that are used by the command. -- -- This function takes a list of options (often just @[]@, see 'CmdOption' for the available -- options), the name of the executable (either a full name, or a program on the @$PATH@) and -- a list of arguments. The result is often @()@, but can be a tuple containing any of 'Stdout', -- 'Stderr' and 'Exit'.
Some examples: -- -- @ -- 'command_' [] \"gcc\" [\"-c\",\"myfile.c\"] -- compile a file, throwing an exception on failure -- 'Exit' c <- 'command' [] \"gcc\" [\"-c\",myfile] -- run a command, recording the exit code -- ('Exit' c, 'Stderr' err) <- 'command' [] \"gcc\" [\"-c\",\"myfile.c\"] -- run a command, recording the exit code and error output -- 'Stdout' out <- 'command' [] \"gcc\" [\"-MM\",\"myfile.c\"] -- run a command, recording the output -- 'command_' ['Cwd' \"generated\"] \"gcc\" [\"-c\",myfile] -- run a command in a directory -- @ -- -- Unless you retrieve the 'ExitCode' using 'Exit', any 'ExitFailure' will throw an error, including -- the 'Stderr' in the exception message. If you capture the 'Stdout' or 'Stderr', that stream will not be echoed to the console, -- unless you use the option 'EchoStdout' or 'EchoStderr'. -- -- If you use 'command' inside a @do@ block and do not use the result, you may get a compile-time error about being -- unable to deduce 'CmdResult'. To avoid this error, use 'command_'. -- -- By default the @stderr@ stream will be captured for use in error messages, and also echoed. To only echo -- pass @'WithStderr' 'False'@, which causes no streams to be captured by Shake, and certain programs (e.g. @gcc@) -- to detect they are running in a terminal. command :: (Partial, CmdResult r) => [CmdOption] -> String -> [String] -> Action r command opts x xs = withFrozenCallStack $ b <$> commandExplicitAction (Params "command" opts a x xs) where (a,b) = cmdResult -- | A version of 'command' where you do not require any results, used to avoid errors about being unable -- to deduce 'CmdResult'. 
command_ :: Partial => [CmdOption] -> String -> [String] -> Action () command_ opts x xs = withFrozenCallStack $ void $ commandExplicitAction (Params "command_" opts [] x xs) --------------------------------------------------------------------- -- VARIABLE ARGUMENT WRAPPER -- | A type annotation, equivalent to the first argument, but in variable argument contexts, -- gives a clue as to what return type is expected (not actually enforced). type a :-> t = a -- | Build or execute a system command. Before using 'cmd' to run a command, make sure you 'Development.Shake.need' any files -- that are used by the command. -- -- * @String@ arguments are treated as a list of whitespace separated arguments. -- -- * @[String]@ arguments are treated as a list of literal arguments. -- -- * 'CmdOption' arguments are used as options. -- -- * 'CmdArgument' arguments, which can be built by 'cmd' itself, are spliced into the containing command. -- -- Typically only string literals should be passed as @String@ arguments. When using variables -- prefer @[myvar]@ so that if @myvar@ contains spaces they are properly escaped. 
-- -- As some examples, here are some calls, and the resulting command string: -- -- @ -- 'cmd_' \"git log --pretty=\" \"oneline\" -- git log --pretty= oneline -- 'cmd_' \"git log --pretty=\" [\"oneline\"] -- git log --pretty= oneline -- 'cmd_' \"git log\" (\"--pretty=\" ++ \"oneline\") -- git log --pretty=oneline -- 'cmd_' \"git log\" (\"--pretty=\" ++ \"one line\") -- git log --pretty=one line -- 'cmd_' \"git log\" [\"--pretty=\" ++ \"one line\"] -- git log "--pretty=one line" -- @ -- -- More examples, including return values, see this translation of the examples given for the 'command' function: -- -- @ -- 'cmd_' \"gcc -c myfile.c\" -- compile a file, throwing an exception on failure -- 'Exit' c <- 'cmd' \"gcc -c\" [myfile] -- run a command, recording the exit code -- ('Exit' c, 'Stderr' err) <- 'cmd' \"gcc -c myfile.c\" -- run a command, recording the exit code and error output -- 'Stdout' out <- 'cmd' \"gcc -MM myfile.c\" -- run a command, recording the output -- 'cmd' ('Cwd' \"generated\") \"gcc -c\" [myfile] :: 'Action' () -- run a command in a directory -- -- let gccCommand = 'cmd' \"gcc -c\" :: 'CmdArgument' -- build a sub-command. 'cmd' can return 'CmdArgument' values as well as execute commands -- cmd ('Cwd' \"generated\") gccCommand [myfile] -- splice that command into a greater command -- @ -- -- If you use 'cmd' inside a @do@ block and do not use the result, you may get a compile-time error about being -- unable to deduce 'CmdResult'. To avoid this error, use 'cmd_'. If you enable @OverloadedStrings@ or @OverloadedLists@ -- you may have to give type signatures to the arguments, or use the more constrained 'command' instead. -- -- The 'cmd' function can also be run in the 'IO' monad, but then 'Traced' is ignored and command lines are not echoed. 
-- As an example: -- -- @ -- 'cmd' ('Cwd' \"generated\") 'Shell' \"gcc -c myfile.c\" :: IO () -- @ cmd :: (Partial, CmdArguments args) => args :-> Action r cmd = withFrozenCallStack $ cmdArguments mempty -- | See 'cmd'. Same as 'cmd' except with a unit result. -- 'cmd' is to 'cmd_' as 'command' is to 'command_'. cmd_ :: (Partial, CmdArguments args, Unit args) => args :-> Action () cmd_ = withFrozenCallStack cmd -- | The arguments to 'cmd' - see 'cmd' for examples and semantics. newtype CmdArgument = CmdArgument [Either CmdOption String] deriving (Eq, Semigroup, Monoid, Show) -- | The arguments to 'cmd' - see 'cmd' for examples and semantics. class CmdArguments t where -- | Arguments to cmd cmdArguments :: Partial => CmdArgument -> t instance (IsCmdArgument a, CmdArguments r) => CmdArguments (a -> r) where cmdArguments xs x = cmdArguments $ xs `mappend` toCmdArgument x instance CmdResult r => CmdArguments (Action r) where cmdArguments (CmdArgument x) = case partitionEithers x of (opts, x:xs) -> let (a,b) = cmdResult in b <$> commandExplicitAction (Params "cmd" opts a x xs) _ -> error "Error, no executable or arguments given to Development.Shake.cmd" instance CmdResult r => CmdArguments (IO r) where cmdArguments (CmdArgument x) = case partitionEithers x of (opts, x:xs) -> let (a,b) = cmdResult in b <$> commandExplicitIO (Params "cmd" opts a x xs) _ -> error "Error, no executable or arguments given to Development.Shake.cmd" instance CmdArguments CmdArgument where cmdArguments = id -- | Class to convert an a to a CmdArgument class IsCmdArgument a where -- | Conversion to a CmdArgument toCmdArgument :: a -> CmdArgument instance IsCmdArgument () where toCmdArgument = mempty instance IsCmdArgument String where toCmdArgument = CmdArgument . map Right . words instance IsCmdArgument [String] where toCmdArgument = CmdArgument . map Right instance IsCmdArgument (NonEmpty String) where toCmdArgument = toCmdArgument . 
toList instance IsCmdArgument CmdOption where toCmdArgument = CmdArgument . pure . Left instance IsCmdArgument [CmdOption] where toCmdArgument = CmdArgument . map Left instance IsCmdArgument CmdArgument where toCmdArgument = id instance IsCmdArgument a => IsCmdArgument (Maybe a) where toCmdArgument = maybe mempty toCmdArgument --------------------------------------------------------------------- -- UTILITIES -- A better version of showCommandForUser, which doesn't escape so much on Windows showCommandForUser2 :: FilePath -> [String] -> String showCommandForUser2 cmd args = unwords $ map (\x -> if safe x then x else showCommandForUser x []) $ cmd : args where safe xs = xs /= "" && not (any bad xs) bad x = isSpace x || (x == '\\' && not isWindows) || x `elem` ("\"\'" :: String)

shake-0.19.8/src/Development/Shake/Config.hs

{-# LANGUAGE DeriveDataTypeable, GeneralizedNewtypeDeriving, TypeFamilies #-} -- | A module for parsing and using config files in a Shake build system. Config files -- consist of variable bindings, for example: -- -- > # This is my Config file -- > HEADERS_DIR = /path/to/dir -- > CFLAGS = -g -I${HEADERS_DIR} -- > CFLAGS = $CFLAGS -O2 -- > include extra/file.cfg -- -- This defines the variable @HEADERS_DIR@ (equal to @\/path\/to\/dir@), and -- @CFLAGS@ (equal to @-g -I\/path\/to\/dir -O2@), and also includes the configuration -- statements in the file @extra/file.cfg@. The full lexical syntax for configuration -- files is defined here: . -- The use of Ninja file syntax is due to convenience and the desire to reuse an -- externally-defined specification (but the choice of configuration language is mostly arbitrary). -- -- To use the configuration file either use 'readConfigFile' to parse the configuration file -- and use the values directly, or 'usingConfigFile' and 'getConfig' to track the configuration -- values, so they become build dependencies.
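-- As a sketch of the result, reading the example file above with 'readConfigFile'
-- would yield a map along these lines (assuming @extra/file.cfg@ adds no further bindings):
--
-- @
-- fromList [(\"CFLAGS\",\"-g -I\/path\/to\/dir -O2\"),(\"HEADERS_DIR\",\"\/path\/to\/dir\")]
-- @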
module Development.Shake.Config( readConfigFile, readConfigFileWithEnv, usingConfigFile, usingConfig, getConfig, getConfigKeys ) where import Development.Shake import Development.Shake.Classes import qualified Development.Ninja.Parse as Ninja import qualified Development.Ninja.Env as Ninja import qualified Data.HashMap.Strict as Map import qualified Data.ByteString.UTF8 as UTF8 import Data.Tuple.Extra import Data.List -- | Read a config file, returning a list of the variables and their bindings. -- Config files use the Ninja lexical syntax: -- readConfigFile :: FilePath -> IO (Map.HashMap String String) readConfigFile = readConfigFileWithEnv [] -- | Read a config file with an initial environment, returning a list of the variables and their bindings. -- Config files use the Ninja lexical syntax: -- readConfigFileWithEnv :: [(String, String)] -> FilePath -> IO (Map.HashMap String String) readConfigFileWithEnv vars file = do env <- Ninja.newEnv mapM_ (uncurry (Ninja.addEnv env) . (UTF8.fromString *** UTF8.fromString)) vars Ninja.parse file env mp <- Ninja.fromEnv env pure $ Map.fromList $ map (UTF8.toString *** UTF8.toString) $ Map.toList mp newtype Config = Config String deriving (Show,Typeable,Eq,Hashable,Binary,NFData) newtype ConfigKeys = ConfigKeys () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) type instance RuleResult Config = Maybe String type instance RuleResult ConfigKeys = [String] -- | Specify the file to use with 'getConfig'. usingConfigFile :: FilePath -> Rules () usingConfigFile file = do mp <- newCache $ \() -> do need [file] liftIO $ readConfigFile file addOracle $ \(Config x) -> Map.lookup x <$> mp () addOracle $ \(ConfigKeys ()) -> sort . Map.keys <$> mp () pure () -- | Specify the values to use with 'getConfig', generally prefer -- 'usingConfigFile' unless you also need access to the values -- of variables outside 'Action'. 
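-- A minimal sketch (the key and value here are made up for illustration):
--
-- @
-- 'usingConfig' $ Map.fromList [(\"CFLAGS\",\"-O2\")]
-- @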
usingConfig :: Map.HashMap String String -> Rules () usingConfig mp = do addOracle $ \(Config x) -> pure $ Map.lookup x mp addOracle $ \(ConfigKeys ()) -> pure $ sort $ Map.keys mp pure () -- | Obtain the value of a configuration variable, returns 'Nothing' to indicate the variable -- has no binding. Any build system using 'getConfig' /must/ call either 'usingConfigFile' -- or 'usingConfig'. The 'getConfig' function will introduce a dependency on the configuration -- variable (but not the whole configuration file), and if the configuration variable changes, the rule will be rerun. -- As an example: -- -- @ -- 'usingConfigFile' \"myconfiguration.cfg\" -- \"*.o\" '%>' \\out -> do -- cflags <- 'getConfig' \"CFLAGS\" -- 'cmd' \"gcc\" [out '-<.>' \"c\"] (fromMaybe \"\" cflags) -- @ getConfig :: String -> Action (Maybe String) getConfig = askOracle . Config -- | Obtain the configuration keys. -- Any build system using 'getConfigKeys' /must/ call either 'usingConfigFile' or 'usingConfig'. -- The 'getConfigKeys' function will introduce a dependency on the configuration keys -- (but not the whole configuration file), and if the configuration keys change, the rule will be rerun. -- Usually use as part of an action. -- As an example: -- -- @ -- 'usingConfigFile' \"myconfiguration.cfg\" -- 'action' $ need =<< getConfigKeys -- @ getConfigKeys :: Action [String] getConfigKeys = askOracle $ ConfigKeys ()

shake-0.19.8/src/Development/Shake/Database.hs

{-# LANGUAGE LambdaCase #-} {-# LANGUAGE RecordWildCards #-} -- | Lower-level primitives to drive Shake, which are wrapped into the -- 'Development.Shake.shake' function. Useful if you want to perform multiple Shake -- runs in a row without reloading from the database. -- Sometimes used in conjunction with @'shakeFiles'=\"\/dev\/null\"@.
-- Using these functions you can approximate the 'Development.Shake.shake' experience with: -- -- @ -- shake opts rules = do -- (_, after) \<- 'shakeWithDatabase' opts rules $ \\db -> do -- 'shakeOneShotDatabase' db -- 'shakeRunDatabase' db [] -- 'shakeRunAfter' opts after -- @ module Development.Shake.Database( ShakeDatabase, shakeOpenDatabase, shakeWithDatabase, shakeOneShotDatabase, shakeRunDatabase, shakeLiveFilesDatabase, shakeProfileDatabase, shakeErrorsDatabase, shakeRunAfter ) where import Control.Concurrent.Extra import Control.Exception import Control.Monad import Control.Monad.IO.Class import Data.IORef import General.Cleanup import Development.Shake.Internal.Errors import Development.Shake.Internal.Options import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Rules.Default data UseState = Closed | Using String | Open {openOneShot :: Bool, openRequiresReset :: Bool} -- | The type of an open Shake database. Created with -- 'shakeOpenDatabase' or 'shakeWithDatabase'. Used with -- 'shakeRunDatabase'. You may not execute simultaneous calls using 'ShakeDatabase' -- on separate threads (it will raise an error). data ShakeDatabase = ShakeDatabase (Var UseState) RunState -- | Given some options and rules, return a pair. The first component opens the database, -- the second cleans it up. The creation /does not/ need to be run masked, because the -- cleanup is able to run at any point. Most users should prefer 'shakeWithDatabase' -- which handles exceptions during creation properly.
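-- A minimal sketch of using the pair directly (prefer 'shakeWithDatabase' where possible):
--
-- @
-- (open, close) <- 'shakeOpenDatabase' opts rules
-- db <- open
-- _ <- 'shakeRunDatabase' db []
-- close
-- @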
shakeOpenDatabase :: ShakeOptions -> Rules () -> IO (IO ShakeDatabase, IO ()) shakeOpenDatabase opts rules = do (cleanup, clean) <- newCleanup use <- newVar $ Open False False let alloc = withOpen use "shakeOpenDatabase" id $ \_ -> ShakeDatabase use <$> open cleanup opts (rules >> defaultRules) let free = do modifyVar_ use $ \case Using s -> throwM $ errorStructured "Error when calling shakeOpenDatabase close function, currently running" [("Existing call", Just s)] "" _ -> pure Closed clean pure (alloc, free) withOpen :: Var UseState -> String -> (UseState -> UseState) -> (UseState -> IO a) -> IO a withOpen var name final act = mask $ \restore -> do o <- modifyVar var $ \case Using s -> throwM $ errorStructured ("Error when calling " ++ name ++ ", currently running") [("Existing call", Just s)] "" Closed -> throwM $ errorStructured ("Error when calling " ++ name ++ ", already closed") [] "" o@Open{} -> pure (Using name, o) let clean = writeVar var $ final o res <- restore (act o) `onException` clean clean pure res -- | Declare that a just-opened database will be used to call 'shakeRunDatabase' at most once. -- If so, an optimisation can be applied to retain less memory. shakeOneShotDatabase :: ShakeDatabase -> IO () shakeOneShotDatabase (ShakeDatabase use _) = withOpen use "shakeOneShotDatabase" (\o -> o{openOneShot=True}) $ \_ -> pure () -- | Given some options and rules, create a 'ShakeDatabase' that can be used to run -- executions. shakeWithDatabase :: ShakeOptions -> Rules () -> (ShakeDatabase -> IO a) -> IO a shakeWithDatabase opts rules act = do (db, clean) <- shakeOpenDatabase opts rules (act =<< db) `finally` clean -- | Given a 'ShakeDatabase', what files did the execution ensure were up-to-date -- in the previous call to 'shakeRunDatabase'. Corresponds to the list of files -- written out to 'shakeLiveFiles'.
shakeLiveFilesDatabase :: ShakeDatabase -> IO [FilePath] shakeLiveFilesDatabase (ShakeDatabase use s) = withOpen use "shakeLiveFilesDatabase" id $ \_ -> liveFilesState s -- | Given a 'ShakeDatabase', generate profile information to the given file about the latest run. -- See 'shakeReport' for the types of file that can be generated. shakeProfileDatabase :: ShakeDatabase -> FilePath -> IO () shakeProfileDatabase (ShakeDatabase use s) file = withOpen use "shakeProfileDatabase" id $ \_ -> profileState s file -- | Given a 'ShakeDatabase', what files did the execution reach an error on last time. -- Some special considerations when using this function: -- -- * The presence of an error does not mean the build will fail, specifically if a -- previously required dependency was run and raised an error, then the thing that previously -- required it will be run. If the build system has changed in an untracked manner, -- the build may succeed this time round. -- -- * If the previous run actually failed then 'shakeRunDatabase' will have thrown an exception. -- You probably want to catch that exception so you can make the call to 'shakeErrorsDatabase'. -- -- * You may see a single failure reported multiple times, with increasingly large call stacks, showing -- the ways in which the error led to further errors throughout. -- -- * The 'SomeException' values are highly likely to be of type 'ShakeException'. -- -- * If you want as many errors as possible in one run, set @'shakeStaunch'=True@. shakeErrorsDatabase :: ShakeDatabase -> IO [(String, SomeException)] shakeErrorsDatabase (ShakeDatabase use s) = withOpen use "shakeErrorsDatabase" id $ \_ -> errorsState s -- | Given an open 'ShakeDatabase', run both whatever actions were added to the 'Rules', -- plus the list of 'Action' given here.
Returns the results from the explicitly passed -- actions along with a list of actions to run after the database was closed, as added with -- 'Development.Shake.runAfter' and 'Development.Shake.removeFilesAfter'. shakeRunDatabase :: ShakeDatabase -> [Action a] -> IO ([a], [IO ()]) shakeRunDatabase (ShakeDatabase use s) as = withOpen use "shakeRunDatabase" (\o -> o{openRequiresReset=True}) $ \Open{..} -> do when openRequiresReset $ do when openOneShot $ throwM $ errorStructured "Error when calling shakeRunDatabase twice, after calling shakeOneShotDatabase" [] "" reset s (refs, as) <- fmap unzip $ forM as $ \a -> do ref <- newIORef Nothing pure (ref, liftIO . writeIORef ref . Just =<< a) after <- run s openOneShot $ map void as results <- mapM readIORef refs case sequence results of Just result -> pure (result, after) Nothing -> throwM $ errorInternal "Expected all results were written, but some were not" shake-0.19.8/src/Development/Shake/FilePath.hs0000644000000000000000000001342507346545000017236 0ustar0000000000000000 -- | A module for 'FilePath' operations exposing "System.FilePath" plus some additional operations. -- -- /Windows note:/ The extension methods ('<.>', 'takeExtension' etc.) use the Posix variants since on -- Windows @\"\/\/\*\" '<.>' \"txt\"@ produces @\"\/\/\*\\\\.txt\"@ -- (which is bad for 'Development.Shake.FilePattern' values).
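The Windows note above can be illustrated with a small standalone sketch (a hypothetical example, not part of this module; it relies only on the Posix '<.>' that the module re-exports):

```haskell
-- A sketch of why the Posix extension operators are re-exported:
-- the Posix <.> treats its argument purely as a string, so a
-- FilePattern such as "//*" keeps its forward slashes intact.
import qualified System.FilePath.Posix as Posix

main :: IO ()
main = print ("//*" Posix.<.> "txt")  -- "//*.txt", still a valid FilePattern
```

With the native Windows '<.>' the same call instead yields @\"\/\/\*\\\\.txt\"@, as the note above says, which no longer works as a 'Development.Shake.FilePattern'.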
module Development.Shake.FilePath( module System.FilePath, module System.FilePath.Posix, dropDirectory1, takeDirectory1, replaceDirectory1, makeRelativeEx, normaliseEx, toNative, toStandard, exe ) where import System.Directory (canonicalizePath) import System.Info.Extra import Data.List.Extra import Data.Maybe import qualified System.FilePath as Native import System.FilePath hiding (splitExtension, takeExtension, replaceExtension, dropExtension, addExtension ,hasExtension, (<.>), splitExtensions, takeExtensions, dropExtensions ) import System.FilePath.Posix (splitExtension, takeExtension, replaceExtension, dropExtension, addExtension ,hasExtension, (<.>), splitExtensions, takeExtensions, dropExtensions ) -- | Drop the first directory from a 'FilePath'. Should only be used on -- relative paths. -- -- > dropDirectory1 "aaa/bbb" == "bbb" -- > dropDirectory1 "aaa/" == "" -- > dropDirectory1 "aaa" == "" -- > dropDirectory1 "" == "" dropDirectory1 :: FilePath -> FilePath dropDirectory1 = drop1 . dropWhile (not . isPathSeparator) -- | Take the first component of a 'FilePath'. Should only be used on -- relative paths. -- -- > takeDirectory1 "aaa/bbb" == "aaa" -- > takeDirectory1 "aaa/" == "aaa" -- > takeDirectory1 "aaa" == "aaa" takeDirectory1 :: FilePath -> FilePath takeDirectory1 = takeWhile (not . isPathSeparator) -- | Replace the first component of a 'FilePath'. Should only be used on -- relative paths. -- -- > replaceDirectory1 "root/file.ext" "directory" == "directory/file.ext" -- > replaceDirectory1 "root/foo/bar/file.ext" "directory" == "directory/foo/bar/file.ext" replaceDirectory1 :: FilePath -> String -> FilePath replaceDirectory1 x dir = dir </> dropDirectory1 x -- | Make a path relative. Returns Nothing only when the given paths are on -- different drives. This will try the pure function makeRelative first. If that -- fails, the paths are canonicalised (removing any indirection and symlinks) -- and a relative path is derived from there.
-- -- > > -- Given that "/root/a/" is not a symlink -- > > makeRelativeEx "/root/a/" "/root/b/file.out" -- > Just "../b/file.out" -- > -- > > -- Given that "/root/c/" is a symlink to "/root/e/f/g/" -- > > makeRelativeEx "/root/c/" "/root/b/file.out" -- > Just "../../../b/file.out" -- > -- > > -- On Windows -- > > makeRelativeEx "C:\\foo" "D:\\foo\\bar" -- > Nothing -- makeRelativeEx :: FilePath -> FilePath -> IO (Maybe FilePath) makeRelativeEx pathA pathB | isRelative makeRelativePathAPathB = pure (Just makeRelativePathAPathB) | otherwise = do a' <- canonicalizePath pathA b' <- canonicalizePath pathB if takeDrive a' /= takeDrive b' then pure Nothing else Just <$> makeRelativeEx' a' b' where makeRelativePathAPathB = makeRelative pathA pathB makeRelativeEx' :: FilePath -> FilePath -> IO FilePath makeRelativeEx' a b = do let rel = makeRelative a b parent = takeDirectory a if isRelative rel then pure rel else if a /= parent then do parentToB <- makeRelativeEx' parent b pure (".." </> parentToB) -- Impossible: makeRelative should have succeeded in finding -- a relative path once `a == "/"`. else error $ "Error calculating relative path from \"" ++ a ++ "\" to \"" ++ b ++ "\"" -- | Normalise a 'FilePath', applying the rules: -- -- * All 'pathSeparators' become 'pathSeparator' (@\/@ on Linux, @\\@ on Windows) -- -- * @foo\/bar\/..\/baz@ becomes @foo\/baz@ (not universally true in the presence of symlinks) -- -- * @foo\/.\/bar@ becomes @foo\/bar@ -- -- * @foo\/\/bar@ becomes @foo\/bar@ -- -- This function is not based on the 'normalise' function from the @filepath@ library, as that function -- is quite broken.
normaliseEx :: FilePath -> FilePath normaliseEx xs | a:b:xs <- xs, isWindows && sep a && sep b = '/' : f ('/':xs) -- account for UNC paths being double // | otherwise = f xs where sep = Native.isPathSeparator f o = toNative $ deslash o $ (++"/") $ concatMap ('/':) $ reverse $ g 0 $ reverse $ split o deslash o x | x == "/" = case (pre,pos) of (True,True) -> "/" (True,False) -> "/." (False,True) -> "./" (False,False) -> "." | otherwise = (if pre then id else drop1) $ (if pos then id else init) x where pre = sep $ fromMaybe ' ' $ listToMaybe o pos = sep $ last $ " " ++ o g i [] = replicate i ".." g i ("..":xs) = g (i+1) xs g i (".":xs) = g i xs g 0 (x:xs) = x : g 0 xs g i (_:xs) = g (i-1) xs -- equivalent to eliminating ../x split xs = if null ys then [] else a : split b where (a,b) = break sep ys ys = dropWhile sep xs -- | Convert to native path separators, namely @\\@ on Windows. toNative :: FilePath -> FilePath toNative = if isWindows then map (\x -> if x == '/' then '\\' else x) else id -- | Convert all path separators to @/@, even on Windows. toStandard :: FilePath -> FilePath toStandard = if isWindows then map (\x -> if x == '\\' then '/' else x) else id -- | The extension of executables, @\"exe\"@ on Windows and @\"\"@ otherwise. exe :: String exe = if isWindows then "exe" else "" shake-0.19.8/src/Development/Shake/Forward.hs0000644000000000000000000001643207346545000017147 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, Rank2Types, ScopedTypeVariables, ViewPatterns #-} {-# LANGUAGE TypeFamilies #-} -- | A module for producing forward-defined build systems, in contrast to standard backwards-defined -- build systems such as shake. Based around ideas from . 
-- As an example: -- -- @ -- import "Development.Shake" -- import "Development.Shake.Forward" -- import "Development.Shake.FilePath" -- -- main = 'shakeArgsForward' 'shakeOptions' $ do -- contents <- 'readFileLines' \"result.txt\" -- 'cache' $ 'cmd' \"tar -cf result.tar\" contents -- @ -- -- Compared to backward-defined build systems (such as normal Shake), forward-defined build -- systems tend to be simpler for simple systems (less boilerplate, more direct style), but more -- complex for larger build systems (requires explicit parallelism, explicit sharing of build products, -- no automatic command line targets). As a general approach for writing forward-defined systems: -- -- * Figure out the sequence of system commands that will build your project. -- -- * Write a simple 'Action' that builds your project. -- -- * Insert 'cache' in front of most system commands. -- -- * Replace most loops with 'forP', where they can be executed in parallel. -- -- * Where Haskell performs real computation, if zero-build performance is insufficient, use 'cacheAction'. -- -- All forward-defined systems use 'AutoDeps', which requires @fsatrace@ to be on the @$PATH@. -- You can obtain @fsatrace@ from . You must set -- 'shakeLintInside' to specify where 'AutoDeps' will look for dependencies - if you want all dependencies -- everywhere use @[\"\"]@. -- -- This module is considered experimental - it has not been battle tested. There are now a few possible -- alternatives in this space: -- -- * Pier (built on Shake). -- -- * Rattle (by the same author as Shake). -- -- * Stroll . 
module Development.Shake.Forward( shakeForward, shakeArgsForward, forwardOptions, forwardRule, cache, cacheAction, cacheActionWith, ) where import Control.Monad import Development.Shake import Development.Shake.Rule import Development.Shake.Command import Development.Shake.Classes import Development.Shake.FilePath import Data.IORef.Extra import Data.Either import Data.Typeable import Data.List.Extra import Control.Exception.Extra import Numeric import System.IO.Unsafe import Data.Binary import qualified Data.ByteString as BS import qualified Data.ByteString.Lazy as LBS import qualified Data.HashMap.Strict as Map {-# NOINLINE forwards #-} forwards :: IORef (Map.HashMap Forward (Action Forward)) forwards = unsafePerformIO $ newIORef Map.empty -- I'd like to use TypeRep, but it doesn't have any instances in older versions newtype Forward = Forward (String, String, BS.ByteString) -- the type, the Show, the payload deriving (Hashable,Typeable,Eq,NFData,Binary) mkForward :: (Typeable a, Show a, Binary a) => a -> Forward mkForward x = Forward (show $ typeOf x, show x, encode' x) unForward :: forall a . (Typeable a, Binary a) => Forward -> a unForward (Forward (got,_,x)) | got /= want = error $ "Failed to match forward type, wanted " ++ show want ++ ", got " ++ show got | otherwise = decode' x where want = show $ typeRep (Proxy :: Proxy a) encode' :: Binary a => a -> BS.ByteString encode' = BS.concat . LBS.toChunks . encode decode' :: Binary a => BS.ByteString -> a decode' = decode . LBS.fromChunks . pure type instance RuleResult Forward = Forward instance Show Forward where show (Forward (_,x,_)) = x -- | Run a forward-defined build system. shakeForward :: ShakeOptions -> Action () -> IO () shakeForward opts act = shake (forwardOptions opts) (forwardRule act) -- | Run a forward-defined build system, interpreting command-line arguments. 
shakeArgsForward :: ShakeOptions -> Action () -> IO () shakeArgsForward opts act = shakeArgs (forwardOptions opts) (forwardRule act) -- | Given an 'Action', turn it into a 'Rules' structure which runs in forward mode. forwardRule :: Action () -> Rules () forwardRule act = do opts <- getShakeOptionsRules when (null $ shakeLintInside opts) $ fail "When running in forward mode you must set shakeLintInside to specify where to detect dependencies" addBuiltinRule noLint noIdentity $ \k old mode -> case old of Just old | mode == RunDependenciesSame -> pure $ RunResult ChangedNothing old (decode' old) _ -> do res <- liftIO $ atomicModifyIORef forwards $ \mp -> (Map.delete k mp, Map.lookup k mp) case res of Nothing -> liftIO $ errorIO $ "Failed to find action name, " ++ show k Just act -> do new <- act pure $ RunResult ChangedRecomputeSame (encode' new) new action act -- | Given a 'ShakeOptions', set the options necessary to execute in forward mode. forwardOptions :: ShakeOptions -> ShakeOptions forwardOptions opts = opts{shakeCommandOptions=[AutoDeps]} -- | Cache an action, given a key and an 'Action'. Each call in your program should specify a different -- key, but the key should remain consistent between runs. Ideally, the 'Action' will gather all its dependencies -- with tracked operations, e.g. 'readFile\''. However, if information is accessed from the environment -- (e.g. the action is a closure), you should call 'cacheActionWith' being explicit about what is captured. cacheAction :: (Typeable a, Binary a, Show a, Typeable b, Binary b, Show b) => a -> Action b -> Action b cacheAction (mkForward -> key) (action :: Action b) = do liftIO $ atomicModifyIORef_ forwards $ Map.insert key (mkForward <$> action) res <- apply1 key liftIO $ atomicModifyIORef_ forwards $ Map.delete key pure $ unForward res newtype With a = With a deriving (Typeable, Binary, Show) -- | Like 'cacheAction', but also specify which information is captured by the closure of the 'Action'. 
If that -- information changes, the 'Action' will be rerun. cacheActionWith :: (Typeable a, Binary a, Show a, Typeable b, Binary b, Show b, Typeable c, Binary c, Show c) => a -> b -> Action c -> Action c cacheActionWith key argument action = do cacheAction (With argument) $ do alwaysRerun pure argument cacheAction key $ do apply1 $ mkForward $ With argument action -- | Apply caching to an external command using the same arguments as 'cmd'. -- -- > cache $ cmd "gcc -c" ["foo.c"] "-o" ["foo.o"] -- -- This command will be cached, with the inputs/outputs traced. If any of the -- files used by this command (e.g. @foo.c@ or header files it imports) change, then -- the command will rerun. cache :: (forall r . CmdArguments r => r) -> Action () cache cmd = do let CmdArgument args = cmd let isDull ['-',_] = True; isDull _ = False let name = headDef "unknown" $ filter (not . isDull) $ drop1 $ rights args cacheAction (Command $ toStandard name ++ " #" ++ upper (showHex (abs $ hash $ show args) "")) cmd newtype Command = Command String deriving (Typeable, Binary) instance Show Command where show (Command x) = "command " ++ x shake-0.19.8/src/Development/Shake/Internal/0000755000000000000000000000000007346545000016755 5ustar0000000000000000shake-0.19.8/src/Development/Shake/Internal/Args.hs0000644000000000000000000005731307346545000020216 0ustar0000000000000000{-# LANGUAGE TupleSections #-} -- | Command line parsing flags.
module Development.Shake.Internal.Args( shakeOptDescrs, shake, shakeArgs, shakeArgsWith, shakeArgsOptionsWith ) where import Development.Shake.Internal.Paths import Development.Shake.Internal.Options import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Errors import Development.Shake.Internal.CompactUI import Development.Shake.Internal.Demo import Development.Shake.Internal.Core.Action import Development.Shake.FilePath import Development.Shake.Internal.Rules.File import Development.Shake.Internal.Progress import Development.Shake.Database import General.Timing import General.Extra import General.Thread import General.GetOpt import General.EscCodes import Data.Tuple.Extra import Control.DeepSeq import Control.Exception.Extra import Control.Monad import Data.Either import Data.List.Extra import Data.Maybe import System.Directory.Extra import System.Environment import System.Exit import System.Time.Extra -- | Main entry point for running Shake build systems. For an example see the top of the module "Development.Shake". -- Use 'ShakeOptions' to specify how the system runs, and 'Rules' to specify what to build. The function will throw -- an exception if the build fails. -- -- To use command line flags to modify 'ShakeOptions' see 'shakeArgs'. shake :: ShakeOptions -> Rules () -> IO () shake opts rules = do addTiming "Function shake" (_, after) <- shakeWithDatabase opts rules $ \db -> do shakeOneShotDatabase db shakeRunDatabase db [] shakeRunAfter opts after -- | Run a build system using command line arguments for configuration. -- The available flags are those from 'shakeOptDescrs', along with a few additional -- @make@ compatible flags that are not represented in 'ShakeOptions', such as @--print-directory@. -- If there are no file arguments then the 'Rules' are used directly, otherwise the file arguments -- are 'want'ed (after calling 'withoutActions'). 
As an example: -- -- @ -- main = 'shakeArgs' 'shakeOptions'{'shakeFiles' = \"_make\", 'shakeProgress' = 'progressSimple'} $ do -- 'phony' \"clean\" $ 'Development.Shake.removeFilesAfter' \"_make\" [\"\/\/*\"] -- 'want' [\"_make\/neil.txt\",\"_make\/emily.txt\"] -- \"_make\/*.txt\" '%>' \\out -> -- ... build action here ... -- @ -- -- This build system will default to building @neil.txt@ and @emily.txt@, while showing progress messages, -- and putting the Shake files in locations such as @_make\/.database@. Some example command line flags: -- -- * @main --no-progress@ will turn off progress messages. -- -- * @main -j6@ will build on 6 threads. -- -- * @main --help@ will display a list of supported flags. -- -- * @main clean@ will not build anything, but will remove the @_make@ directory, including -- any 'shakeFiles'. -- -- * @main _make/henry.txt@ will not build @neil.txt@ or @emily.txt@, but will instead build @henry.txt@. shakeArgs :: ShakeOptions -> Rules () -> IO () shakeArgs opts rules = shakeArgsWith opts [] f where f _ files = pure $ Just $ if null files then rules else want files >> withoutActions rules -- | A version of 'shakeArgs' with more flexible handling of command line arguments. -- The caller of 'shakeArgsWith' can add additional flags (the second argument) and choose how to convert -- the flags/arguments into rules (the third argument). Given: -- -- @ -- 'shakeArgsWith' opts flags (\\flagValues argValues -> result) -- @ -- -- * @opts@ is the initial 'ShakeOptions' value, which may have some fields overridden by command line flags. -- This argument is usually 'shakeOptions', perhaps with a few fields overridden. -- -- * @flags@ is a list of flag descriptions, which either produce a 'String' containing an error -- message (typically for flags with invalid arguments, e.g. @'Left' \"could not parse as int\"@), or a value -- that is passed as @flagValues@. If you have no custom flags, pass @[]@.
-- -- * @flagValues@ is a list of custom flags that the user supplied. If @flags == []@ then this list will -- be @[]@. -- -- * @argValues@ is a list of non-flag arguments, which are often treated as files and passed to 'want'. -- If arguments are specified then typically the 'want' calls from the rules are discarded using 'withoutActions'. -- -- * @result@ should produce a 'Nothing' to indicate that no building needs to take place, or a 'Just' -- providing the rules that should be used. -- -- As an example of a build system that can use either @gcc@ or @distcc@ for compiling: -- -- @ -- import System.Console.GetOpt -- -- data Flags = DistCC deriving Eq -- flags = [Option \"\" [\"distcc\"] (NoArg $ Right DistCC) \"Run distributed.\"] -- -- main = 'shakeArgsWith' 'shakeOptions' flags $ \\flags targets -> pure $ Just $ do -- let compiler = if DistCC \`elem\` flags then \"distcc\" else \"gcc\" -- let rules = do -- \"*.o\" '%>' \\out -> do -- 'need' ... -- 'cmd' compiler ... -- 'want' [\"target.exe\"] -- ... -- if null targets then rules else 'want' targets >> 'withoutActions' rules -- @ -- -- Now you can pass @--distcc@ to use the @distcc@ compiler. shakeArgsWith :: ShakeOptions -> [OptDescr (Either String a)] -> ([a] -> [String] -> IO (Maybe (Rules ()))) -> IO () shakeArgsWith opt args f = shakeArgsOptionsWith opt args $ \so a b -> fmap (so,) <$> f a b -- | Like 'shakeArgsWith', but also lets you manipulate the 'ShakeOptions'. 
shakeArgsOptionsWith :: ShakeOptions -> [OptDescr (Either String a)] -> (ShakeOptions -> [a] -> [String] -> IO (Maybe (ShakeOptions, Rules ()))) -> IO () shakeArgsOptionsWith baseOpts userOptions rules = do addTiming "shakeArgsWith" let baseOpts2 = removeOverlap userOptions $ map snd shakeOptsEx args <- getArgs let (flag1,files,errs) = getOpt (baseOpts2 `mergeOptDescr` userOptions) args (self,user) = partitionEithers flag1 (flagsExtra,flagsShake) = first concat $ unzip self progressReplays = [x | ProgressReplay x <- flagsExtra] progressRecords = [x | ProgressRecord x <- flagsExtra] changeDirectory = listToMaybe [x | ChangeDirectory x <- flagsExtra] printDirectory = lastDef False [x | PrintDirectory x <- flagsExtra] shareRemoves = [x | ShareRemove x <- flagsExtra] oshakeOpts = foldl' (flip ($)) baseOpts flagsShake lintInside <- mapM canonicalizePath $ shakeLintInside oshakeOpts let shakeOpts = oshakeOpts {shakeLintInside = map (toStandard . addTrailingPathSeparator) lintInside ,shakeLintIgnore = map toStandard $ shakeLintIgnore oshakeOpts ,shakeOutput = if shakeColor oshakeOpts then outputColor (shakeOutput oshakeOpts) else shakeOutput oshakeOpts } let putWhen v msg = when (shakeVerbosity oshakeOpts >= v) $ shakeOutput oshakeOpts v msg let putWhenLn v msg = putWhen v $ msg ++ "\n" let showHelp long = do progName <- getProgName (targets, helpSuffix) <- if not long then pure ([], []) else handleSynchronous (\e -> do putWhenLn Info $ "Failure to collect targets: " ++ show e; pure ([], [])) $ do -- run the rules as simply as we can rs <- rules shakeOpts [] [] case rs of Just (_, rs) -> do xs <- getTargets shakeOpts rs helpSuffix <- getHelpSuffix shakeOpts rs evaluate $ force ([" - " ++ a ++ maybe "" (" - " ++) b | (a,b) <- xs], helpSuffix) _ -> pure ([], []) changes<- pure $ let as = shakeOptionsFields baseOpts bs = shakeOptionsFields oshakeOpts in [" - " ++ lbl ++ ": " ++ v1 ++ " => " ++ v2 | long, ((lbl, v1), (_, v2)) <- zip as bs, v1 /= v2] putWhen Error $ unlines $ 
("Usage: " ++ progName ++ " [options] [target] ...") : (if null baseOpts2 then [] else "" : (if null userOptions then "Options:" else "Standard options:") : showOptDescr baseOpts2) ++ (if null userOptions then [] else "" : "Extra options:" : showOptDescr userOptions) ++ (if null changes then [] else "" : "Changed ShakeOptions:" : changes) ++ (if null targets then [] else "" : "Targets:" : targets) ++ (if null helpSuffix then [] else "" : helpSuffix) when (errs /= []) $ do putWhen Error $ unlines $ map ("shake: " ++) $ filter (not . null) $ lines $ unlines errs showHelp False exitFailure if Help `elem` flagsExtra then showHelp True else if Version `elem` flagsExtra then putWhenLn Info $ "Shake build system, version " ++ shakeVersionString else if NumericVersion `elem` flagsExtra then putWhenLn Info shakeVersionString else if Demo `elem` flagsExtra then demo $ shakeStaunch shakeOpts else if not $ null progressReplays then do dat <- forM progressReplays $ \file -> do src <- readFile file pure (file, map read $ lines src) forM_ (if null $ shakeReport shakeOpts then ["-"] else shakeReport shakeOpts) $ \file -> do putWhenLn Info $ "Writing report to " ++ file writeProgressReport file dat else do when (Sleep `elem` flagsExtra) $ sleep 1 start <- offsetTime initDataDirectory -- must be done before we start changing directory let redir = maybe id withCurrentDirectory changeDirectory shakeOpts <- if null progressRecords then pure shakeOpts else do t <- offsetTime pure shakeOpts{shakeProgress = \p -> void $ withThreadsBoth (shakeProgress shakeOpts p) $ progressDisplay 1 (const $ pure ()) $ do p <- p t <- t forM_ progressRecords $ \file -> appendFile file $ show (t,p) ++ "\n" pure p } (ran,shakeOpts,res) <- redir $ do when printDirectory $ do curdir <- getCurrentDirectory putWhenLn Info $ "shake: In directory `" ++ curdir ++ "'" (shakeOpts, ui) <- do let compact = lastDef No [x | Compact x <- flagsExtra] use <- if compact == Auto then checkEscCodes else pure $ compact == Yes 
if use then second withThreadSlave <$> compactUI shakeOpts else pure (shakeOpts, id) rules <- rules shakeOpts user files ui $ case rules of Nothing -> pure (False, shakeOpts, Right ()) Just (shakeOpts, rules) -> do res <- try_ $ shake shakeOpts $ if NoBuild `elem` flagsExtra then withoutActions rules else if ShareList `elem` flagsExtra || not (null shareRemoves) || ShareSanity `elem` flagsExtra then do action $ do unless (null shareRemoves) $ actionShareRemove shareRemoves when (ShareList `elem` flagsExtra) actionShareList when (ShareSanity `elem` flagsExtra) actionShareSanity withoutActions rules else rules pure (True, shakeOpts, res) if not ran || shakeVerbosity shakeOpts < Info || NoTime `elem` flagsExtra then either throwIO pure res else let esc = if shakeColor shakeOpts then escape else \_ x -> x in case res of Left err -> if Exception `elem` flagsExtra then throwIO err else do putWhenLn Error $ esc Red $ show err exitFailure Right () -> do tot <- start putWhenLn Info $ esc Green $ "Build completed in " ++ showDuration tot -- | A list of command line options that can be used to modify 'ShakeOptions'. Each option returns -- either an error message (invalid argument to the flag) or a function that changes some fields -- in 'ShakeOptions'. The command line flags are @make@ compatible where possible, but additional -- flags have been added for the extra options Shake supports.
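As a sketch of how 'shakeOptDescrs' is typically consumed (a hypothetical helper, assuming 'getOpt' from "System.Console.GetOpt"; @optionsFromArgs@ is not part of this module):

```haskell
import Data.Either (partitionEithers)
import System.Console.GetOpt (ArgOrder(Permute), getOpt)
import Development.Shake (ShakeOptions, shakeOptions, shakeOptDescrs)

-- Hypothetical helper: parse Shake's standard flags from an argument
-- list, then fold the successfully parsed ShakeOptions -> ShakeOptions
-- updates over the default shakeOptions, mirroring how shakeArgsOptionsWith
-- applies flagsShake with foldl' (flip ($)).
optionsFromArgs :: [String] -> Either [String] ShakeOptions
optionsFromArgs args
    | null bad && null errs = Right $ foldl (flip ($)) shakeOptions fns
    | otherwise = Left (bad ++ errs)
  where
    (flags, _files, errs) = getOpt Permute shakeOptDescrs args
    (bad, fns) = partitionEithers flags
```

For example, @optionsFromArgs [\"--jobs=4\"]@ would produce options with 'shakeThreads' set to 4, via the @-j@\/@--jobs@ flag described above.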
shakeOptDescrs :: [OptDescr (Either String (ShakeOptions -> ShakeOptions))] shakeOptDescrs = [fmapFmapOptDescr snd o | (True, o) <- shakeOptsEx] data Extra = ChangeDirectory FilePath | Version | NumericVersion | PrintDirectory Bool | Help | Sleep | NoTime | Exception | NoBuild | ProgressRecord FilePath | ProgressReplay FilePath | Demo | ShareList | ShareSanity | ShareRemove String | Compact Auto deriving Eq data Auto = Yes | No | Auto deriving Eq escape :: Color -> String -> String escape color x = escForeground color ++ x ++ escNormal outputColor :: (Verbosity -> String -> IO ()) -> Verbosity -> String -> IO () outputColor output v msg = output v $ color msg where color = case v of Silent -> id Error -> escape Red Warn -> escape Yellow _ -> escape Blue -- | True if it has a potential effect on ShakeOptions shakeOptsEx :: [(Bool, OptDescr (Either String ([Extra], ShakeOptions -> ShakeOptions)))] shakeOptsEx = [opts $ Option "a" ["abbrev"] (reqArgPair "abbrev" "FULL=SHORT" $ \a s -> s{shakeAbbreviations=shakeAbbreviations s ++ [a]}) "Use abbreviation in status messages." ,opts $ Option "" ["allow-redefine-rules"] (noArg $ \s -> s{shakeAllowRedefineRules = True}) "Allow redefining built-in rules" ,opts $ Option "" ["no-allow-redefine-rules"] (noArg $ \s -> s{shakeAllowRedefineRules = False}) "Forbid redefining built-in rules (default)" ,extr $ Option "" ["no-build"] (noArg [NoBuild]) "Don't build anything." ,extr $ Option "C" ["directory"] (reqArg "DIRECTORY" $ \x -> [ChangeDirectory x]) "Change to DIRECTORY before doing anything." -- ,yes $ Option "" ["cloud"] (reqArg "URL" $ \x s -> s{shakeCloud=shakeCloud s ++ [x]}) "HTTP server providing a cloud cache." ,opts $ Option "" ["color","colour"] (noArg $ \s -> s{shakeColor=True}) "Colorize the output." ,opts $ Option "" ["no-color","no-colour"] (noArg $ \s -> s{shakeColor=False}) "Don't colorize the output." 
,extr $ Option "" ["compact"] (optArgAuto "auto" "yes|no|auto" $ \x -> [Compact x]) "Use a compact Bazel/Buck style output." ,opts $ Option "d" ["debug"] (optArg "FILE" $ \x s -> s{shakeVerbosity=Diagnostic, shakeOutput=outputDebug (shakeOutput s) x}) "Print lots of debugging information." ,extr $ Option "" ["demo"] (noArg [Demo]) "Run in demo mode." ,opts $ Option "" ["digest"] (noArg $ \s -> s{shakeChange=ChangeDigest}) "Files change when digest changes." ,opts $ Option "" ["digest-and"] (noArg $ \s -> s{shakeChange=ChangeModtimeAndDigest}) "Files change when modtime and digest change." ,opts $ Option "" ["digest-and-input"] (noArg $ \s -> s{shakeChange=ChangeModtimeAndDigestInput}) "Files change on modtime (and digest for inputs)." ,opts $ Option "" ["digest-or"] (noArg $ \s -> s{shakeChange=ChangeModtimeOrDigest}) "Files change when modtime or digest change." ,opts $ Option "" ["digest-not"] (noArg $ \s -> s{shakeChange=ChangeModtime}) "Files change when modtime changes." ,extr $ Option "" ["exception"] (noArg [Exception]) "Throw exceptions directly." ,opts $ Option "" ["flush"] (reqIntArg 1 "flush" "N" (\i s -> s{shakeFlush=Just i})) "Flush metadata every N seconds." ,opts $ Option "" ["never-flush"] (noArg $ \s -> s{shakeFlush=Nothing}) "Never explicitly flush metadata." ,extr $ Option "h" ["help"] (noArg [Help]) "Print this message and exit." ,opts $ Option "j" ["jobs"] (optArgInt 0 "jobs" "N" $ \i s -> s{shakeThreads=fromMaybe 0 i}) "Allow N jobs/threads at once [default CPUs]." ,opts $ Option "k" ["keep-going"] (noArg $ \s -> s{shakeStaunch=True}) "Keep going when some targets can't be made." ,opts $ Option "l" ["lint"] (noArg $ \s -> s{shakeLint=Just LintBasic}) "Perform limited validation after the run." ,opts $ Option "" ["lint-watch"] (reqArg "PATTERN" $ \x s -> s{shakeLintWatch=shakeLintWatch s ++ [x]}) "Error if any of the patterns are created (expensive)." 
,opts $ Option "" ["lint-fsatrace"] (optArg "DIR" $ \x s -> s{shakeLint=Just LintFSATrace, shakeLintInside=shakeLintInside s ++ [fromMaybe "." x]}) "Use fsatrace to do validation [in current dir]." ,opts $ Option "" ["lint-ignore"] (reqArg "PATTERN" $ \x s -> s{shakeLintIgnore=shakeLintIgnore s ++ [x]}) "Ignore any lint errors in these patterns." ,opts $ Option "" ["no-lint"] (noArg $ \s -> s{shakeLint=Nothing}) "Turn off --lint." ,opts $ Option "" ["live"] (optArg "FILE" $ \x s -> s{shakeLiveFiles=shakeLiveFiles s ++ [fromMaybe "live.txt" x]}) "List the files that are live [to live.txt]." ,opts $ Option "m" ["metadata"] (reqArg "PREFIX" $ \x s -> s{shakeFiles=x}) "Prefix for storing metadata files." ,extr $ Option "" ["numeric-version"] (noArg [NumericVersion]) "Print just the version number and exit." ,opts $ Option "" ["skip-commands"] (noArg $ \s -> s{shakeRunCommands=False}) "Try and avoid running external programs." ,opts $ Option "B" ["rebuild"] (optArg "PATTERN" $ \x s -> s{shakeRebuild=shakeRebuild s ++ [(RebuildNow, fromMaybe "**" x)]}) "If required, these files will rebuild even if nothing has changed." ,opts $ Option "" ["no-rebuild"] (optArg "PATTERN" $ \x s -> s{shakeRebuild=shakeRebuild s ++ [(RebuildNormal, fromMaybe "**" x)]}) "If required, these files will rebuild only if things have changed (default)." ,opts $ Option "" ["skip"] (optArg "PATTERN" $ \x s -> s{shakeRebuild=shakeRebuild s ++ [(RebuildLater, fromMaybe "**" x)]}) "Don't rebuild matching files this run." -- ,yes $ Option "" ["skip-forever"] (OptArg (\x -> Right ([], \s -> s{shakeRebuild=shakeRebuild s ++ [(RebuildNever, fromMaybe "**" x)]})) "PATTERN") "Don't rebuild matching files until they change." ,opts $ Option "r" ["report","profile"] (optArg "FILE" $ \x s -> s{shakeReport=shakeReport s ++ [fromMaybe "report.html" x]}) "Write out profiling information [to report.html]." ,opts $ Option "" ["no-reports"] (noArg $ \s -> s{shakeReport=[]}) "Turn off --report." 
,opts $ Option "" ["rule-version"] (reqArg "VERSION" $ \x s -> s{shakeVersion=x}) "Version of the build rules." ,opts $ Option "" ["no-rule-version"] (noArg $ \s -> s{shakeVersionIgnore=True}) "Ignore the build rules version." ,opts $ Option "" ["share"] (optArg "DIRECTORY" $ \x s -> s{shakeShare=Just $ fromMaybe "" x, shakeChange=ensureHash $ shakeChange s}) "Shared cache location." ,hide $ Option "" ["share-list"] (noArg ([ShareList], ensureShare)) "List the shared cache files." ,hide $ Option "" ["share-sanity"] (noArg ([ShareSanity], ensureShare)) "Sanity check the shared cache files." ,hide $ Option "" ["share-remove"] (OptArg (\x -> Right ([ShareRemove $ fromMaybe "**" x], ensureShare)) "SUBSTRING") "Remove the shared cache keys." ,opts $ Option "" ["share-copy"] (noArg $ \s -> s{shakeSymlink=False}) "Copy files into the cache." ,opts $ Option "" ["share-symlink"] (noArg $ \s -> s{shakeSymlink=True}) "Symlink files into the cache." ,opts $ Option "s" ["silent"] (noArg $ \s -> s{shakeVerbosity=Silent}) "Don't print anything." ,extr $ Option "" ["sleep"] (noArg [Sleep]) "Sleep for a second before building." ,opts $ Option "S" ["no-keep-going","stop"] (noArg $ \s -> s{shakeStaunch=False}) "Turns off -k." ,opts $ Option "" ["storage"] (noArg $ \s -> s{shakeStorageLog=True}) "Write a storage log." ,both $ Option "p" ["progress"] (progress $ optArgInt 1 "progress" "N" $ \i s -> s{shakeProgress=prog $ fromMaybe 5 i}) "Show progress messages [every N secs, default 5]." ,opts $ Option "" ["no-progress"] (noArg $ \s -> s{shakeProgress=const $ pure ()}) "Don't show progress messages." ,opts $ Option "q" ["quiet"] (noArg $ \s -> s{shakeVerbosity=move (shakeVerbosity s) pred}) "Print less (pass repeatedly for even less)." ,extr $ Option "" ["no-time"] (noArg [NoTime]) "Don't print build time." ,opts $ Option "" ["timings"] (noArg $ \s -> s{shakeTimings=True}) "Print phase timings." 
,opts $ Option "V" ["verbose","trace"] (noArg $ \s -> s{shakeVerbosity=move (shakeVerbosity s) succ}) "Print more (pass repeatedly for even more)." ,extr $ Option "v" ["version"] (noArg [Version]) "Print the version number and exit." ,extr $ Option "w" ["print-directory"] (noArg [PrintDirectory True]) "Print the current directory." ,extr $ Option "" ["no-print-directory"] (noArg [PrintDirectory False]) "Turn off -w, even if it was turned on implicitly." ] where opts o = (True, fmapFmapOptDescr ([],) o) extr o = (False, fmapFmapOptDescr (,id) o) both o = (True, o) hide o = (False, o) -- I do modify the options, but not in a meaningful way move :: Verbosity -> (Int -> Int) -> Verbosity move x by = toEnum $ min (fromEnum mx) $ max (fromEnum mn) $ by $ fromEnum x where (mn,mx) = (asTypeOf minBound x, asTypeOf maxBound x) noArg = NoArg . Right reqArg a f = ReqArg (Right . f) a optArg a f = OptArg (Right . f) a reqIntArg mn flag a f = flip ReqArg a $ \x -> case reads x of [(i,"")] | i >= mn -> Right (f i) _ -> Left $ "the `--" ++ flag ++ "' option requires a number, " ++ show mn ++ " or above" optArgInt mn flag a f = flip OptArg a $ maybe (Right (f Nothing)) $ \x -> case reads x of [(i,"")] | i >= mn -> Right (f $ Just i) _ -> Left $ "the `--" ++ flag ++ "' option requires a number, " ++ show mn ++ " or above" optArgAuto flag a f = flip OptArg a $ maybe (Right (f Yes)) $ \x -> case x of "yes" -> Right $ f Yes "no" -> Right $ f No "auto" -> Right $ f Auto _ -> Left $ "the `--" ++ flag ++ "' option requires yes|no|auto, but got " ++ show x reqArgPair flag a f = flip ReqArg a $ \x -> case break (== '=') x of (a,'=':b) -> Right $ f (a,b) _ -> Left $ "the `--" ++ flag ++ "' option requires an = in the argument" progress (OptArg func msg) = flip OptArg msg $ \x -> case break (== '=') `fmap` x of Just ("record",file) -> Right ([ProgressRecord $ if null file then "progress.txt" else tailErr file], id) Just ("replay",file) -> Right ([ProgressReplay $ if null file then 
"progress.txt" else tailErr file], id) _ -> ([],) <$> func x progress _ = throwImpure $ errorInternal "incomplete pattern, progress" outputDebug output Nothing = output outputDebug output (Just file) = \v msg -> do when (v /= Diagnostic) $ output v msg appendFile file $ removeEscCodes msg ++ "\n" prog i p = do program <- progressProgram progressDisplay i (\s -> progressTitlebar s >> program s) p -- ensure the file system always computes a hash, required for --share ensureHash ChangeModtime = ChangeModtimeAndDigest ensureHash ChangeModtimeAndDigestInput = ChangeModtimeAndDigest ensureHash x = x ensureShare s = s{shakeShare = Just $ fromMaybe "." $ shakeShare s} shake-0.19.8/src/Development/Shake/Internal/CmdOption.hs0000644000000000000000000000703007346545000021205 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable #-} module Development.Shake.Internal.CmdOption(CmdOption(..)) where import Data.Data import qualified Data.ByteString.Lazy.Char8 as LBS -- | Options passed to 'command' or 'cmd' to control how processes are executed. data CmdOption = Cwd FilePath -- ^ Change the current directory in the spawned process. By default uses this processes current directory. -- Successive 'Cwd' options are joined together, to change into nested directories. | Env [(String,String)] -- ^ Change the environment variables in the spawned process. By default uses this processes environment. | AddEnv String String -- ^ Add an environment variable in the child process. | RemEnv String -- ^ Remove an environment variable from the child process. | AddPath [String] [String] -- ^ Add some items to the prefix and suffix of the @$PATH@ variable. | Stdin String -- ^ Given as the @stdin@ of the spawned process. | StdinBS LBS.ByteString -- ^ Given as the @stdin@ of the spawned process. | FileStdin FilePath -- ^ Take the @stdin@ from a file. | Shell -- ^ Pass the command to the shell without escaping - any arguments will be joined with spaces. By default arguments are escaped properly. 
    | BinaryPipes -- ^ Treat the @stdin@\/@stdout@\/@stderr@ messages as binary. By default 'String' results use text encoding and 'ByteString' results use binary encoding.
    | Traced String -- ^ Name to use with 'traced', or @\"\"@ for no tracing. By default traces using the name of the executable.
    | Timeout Double -- ^ Abort the computation after N seconds, will raise a failure exit code. Calls 'interruptProcessGroupOf' and 'terminateProcess', but may sometimes fail to abort the process and not timeout.
    | WithStdout Bool -- ^ Should I include the @stdout@ in the exception if the command fails? Defaults to 'False'.
    | WithStderr Bool -- ^ Should I include the @stderr@ in the exception if the command fails? Defaults to 'True'.
    | EchoStdout Bool -- ^ Should I echo the @stdout@? Defaults to 'True' unless a 'Stdout' result is required or you use 'FileStdout'.
    | EchoStderr Bool -- ^ Should I echo the @stderr@? Defaults to 'True' unless a 'Stderr' result is required or you use 'FileStderr'.
    | FileStdout FilePath -- ^ Write the @stdout@ to a file.
    | FileStderr FilePath -- ^ Write the @stderr@ to a file.
    | AutoDeps -- ^ Compute dependencies automatically. Only works if 'shakeLintInside' has been set to the files where autodeps might live.
    | UserCommand String -- ^ The command the user thinks about, before any munging. Defaults to the actual command.
    | FSAOptions String -- ^ Options to @fsatrace@, a list of strings with characters such as @\"r\"@ (reads) @\"w\"@ (writes). Defaults to @\"rwmdqt\"@ if the output of @fsatrace@ is required.
    | CloseFileHandles -- ^ Before starting the command in the child process, close all file handles except stdin, stdout, stderr in the child process. Uses @close_fds@ from package process and comes with the same caveats, i.e. runtime is linear with the maximum number of open file handles (@RLIMIT_NOFILE@, see @man 2 getrlimit@ on Linux).
    | NoProcessGroup -- ^ Don't run the process in its own group. Required when running @docker@.
Will mean that process timeouts and asynchronous exceptions may not properly clean up child processes.
    | InheritStdin -- ^ Cause the stdin from the parent to be inherited. Might also require NoProcessGroup on Linux. Ignored if you explicitly pass a stdin.
      deriving (Eq,Ord,Show,Data,Typeable)

shake-0.19.8/src/Development/Shake/Internal/CompactUI.hs

-- | Provide a Buck/Bazel style UI.
module Development.Shake.Internal.CompactUI(
    compactUI
    ) where

import Development.Shake.Internal.CmdOption
import Development.Shake.Internal.Options
import Development.Shake.Internal.Progress
import System.Time.Extra
import General.Extra
import Control.Exception
import General.Thread
import General.EscCodes
import Data.IORef.Extra
import Control.Monad.Extra

data S = S
    {sOutput :: [String] -- ^ Messages that haven't yet been printed, in reverse.
    ,sProgress :: String -- ^ Last progress message.
    ,sTraces :: [Maybe (String, String, Seconds)] -- ^ The traced items, in the order we display them.
    ,sUnwind :: Int -- ^ Number of lines we used last time around.
    }

emptyS = S [] "Starting..."
[] 0 addOutput pri msg s = s{sOutput = msg : sOutput s} addProgress x s = s{sProgress = x} addTrace key msg start time s | start = s{sTraces = insert (key,msg,time) $ sTraces s} | otherwise = s{sTraces = remove (\(a,b,_) -> a == key && b == msg) $ sTraces s} where insert v (Nothing:xs) = Just v:xs insert v (x:xs) = x : insert v xs insert v [] = [Just v] remove f (Just x:xs) | f x = Nothing:xs remove f (x:xs) = x : remove f xs remove f [] = [] display :: Seconds -> S -> (S, String) display time s = (s{sOutput=[], sUnwind=length post}, escCursorUp (sUnwind s) ++ unlines (map pad $ pre ++ post)) where pre = sOutput s post = "" : (escForeground Green ++ "Status: " ++ sProgress s ++ escNormal) : map f (sTraces s) pad x = x ++ escClearLine f Nothing = " *" f (Just (k,m,t)) = " * " ++ k ++ " (" ++ g (time - t) m ++ ")" g i m | showDurationSecs i == "0s" = m | i < 10 = s | otherwise = escForeground (if i > 20 then Red else Yellow) ++ s ++ escNormal where s = m ++ " " ++ showDurationSecs i -- | Run a compact UI, with the ShakeOptions modifier, combined with compactUI :: ShakeOptions -> IO (ShakeOptions, IO ()) compactUI opts = do unlessM checkEscCodes $ putStrLn "Your terminal does not appear to support escape codes, --compact mode may not work" ref <- newIORef emptyS let tweak = atomicModifyIORef_ ref time <- offsetTime opts <- pure $ opts {shakeTrace = \a b c -> do t <- time; tweak (addTrace a b c t) ,shakeOutput = \a b -> tweak (addOutput a b) ,shakeProgress = \x -> void $ progressDisplay 1 (tweak . 
addProgress) x `withThreadsBoth` shakeProgress opts x ,shakeCommandOptions = [EchoStdout False, EchoStderr False] ++ shakeCommandOptions opts ,shakeVerbosity = Error } let tick = do t <- time; mask_ $ putStr =<< atomicModifyIORef ref (display t) pure (opts, forever (tick >> sleep 0.4) `finally` tick) shake-0.19.8/src/Development/Shake/Internal/Core/0000755000000000000000000000000007346545000017645 5ustar0000000000000000shake-0.19.8/src/Development/Shake/Internal/Core/Action.hs0000644000000000000000000006647407346545000021437 0ustar0000000000000000{-# LANGUAGE RecordWildCards, NamedFieldPuns, ScopedTypeVariables, ConstraintKinds, TupleSections #-} module Development.Shake.Internal.Core.Action( actionOnException, actionFinally, actionBracket, actionCatch, actionRetry, getShakeOptions, getProgress, runAfter, lintTrackRead, lintTrackWrite, lintTrackAllow, getVerbosity, putWhen, putVerbose, putInfo, putWarn, putError, withVerbosity, quietly, orderOnlyAction, newCacheIO, unsafeExtraThread, parallel, batch, reschedule, historyDisable, traced, -- Internal only producesChecked, producesUnchecked, producesCheck, lintCurrentDirectory, lintWatch, blockApply, unsafeAllowApply, shakeException, lintTrackFinished, getCurrentKey, getLocal, actionShareList, actionShareRemove, actionShareSanity ) where import Control.Exception import Control.Monad.Extra import Control.Monad.IO.Class import Control.DeepSeq import Data.Typeable import System.Directory import System.FilePattern import System.FilePattern.Directory import System.Time.Extra import Control.Concurrent.Extra import Data.Maybe import Data.Tuple.Extra import Data.IORef.Extra import Data.List.Extra import Numeric.Extra import General.Extra import qualified Data.HashMap.Strict as Map import qualified Data.HashSet as Set import Development.Shake.Classes import Development.Shake.Internal.Core.Monad import Development.Shake.Internal.Core.Database import Development.Shake.Internal.History.Shared import General.Pool import 
Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Core.Pool import Development.Shake.Internal.Value import Development.Shake.Internal.FileInfo import Development.Shake.Internal.FileName import Development.Shake.Internal.Options import Development.Shake.Internal.Errors import General.Cleanup import General.Fence --------------------------------------------------------------------- -- RAW WRAPPERS -- | Apply a modification, run an action, then run an undo action after. -- Doesn't actually require exception handling because we don't have the ability to catch exceptions to the user. actionThenUndoLocal :: (Local -> (Local, Local -> Local)) -> Action a -> Action a actionThenUndoLocal f m = Action $ do s <- getRW let (s2,undo) = f s putRW s2 res <- fromAction m modifyRW undo pure res --------------------------------------------------------------------- -- EXCEPTION HANDLING -- | Turn a normal exception into a ShakeException, giving it a stack and printing it out if in staunch mode. -- If the exception is already a ShakeException (e.g. it's a child of ours who failed and we are rethrowing) -- then do nothing with it. 
shakeException :: Global -> Stack -> SomeException -> IO ShakeException
shakeException Global{globalOptions=ShakeOptions{..},..} stk e = case fromException e of
    Just (e :: ShakeException) -> pure e
    Nothing -> do
        e <- pure $ exceptionStack stk e
        when (shakeStaunch && shakeVerbosity >= Error) $
            globalOutput Error $ show e ++ "Continuing due to staunch mode"
        pure e

actionBracketEx :: Bool -> IO a -> (a -> IO b) -> (a -> Action c) -> Action c
actionBracketEx runOnSuccess alloc free act = do
    Global{..} <- Action getRO
    (v, key) <- liftIO $ mask_ $ do
        v <- alloc
        key <- liftIO $ register globalCleanup $ void $ free v
        pure (v, key)
    res <- Action $ catchRAW (fromAction $ act v) $ \e -> liftIO (release key) >> throwRAW e
    liftIO $ if runOnSuccess then release key else unprotect key
    pure res

-- | If an exception is raised by the 'Action', perform some 'IO' then reraise the exception.
-- This function is implemented using 'actionBracket'.
actionOnException :: Action a -> IO b -> Action a
actionOnException act free = actionBracketEx False (pure ()) (const free) (const act)

-- | After an 'Action', perform some 'IO', even if there is an exception.
-- This function is implemented using 'actionBracket'.
actionFinally :: Action a -> IO b -> Action a
actionFinally act free = actionBracket (pure ()) (const free) (const act)

-- | Like 'bracket', but where the inner operation is of type 'Action'. Usually used as
-- @'actionBracket' alloc free use@.
--
-- The @free@ action will be run masked. The cost of 'actionBracket' is _O(n log n)_
-- in the number of simultaneous 'actionBracket' calls active in the program.
actionBracket :: IO a -> (a -> IO b) -> (a -> Action c) -> Action c
actionBracket = actionBracketEx True

-- | If a synchronous exception is raised by the 'Action', perform some handler.
-- Note that there is no guarantee that the handler will run on shutdown (use 'actionFinally' for that),
-- and that 'actionCatch' /cannot/ catch exceptions thrown by dependencies, e.g.
raised by 'need' -- (to do so would allow untracked dependencies on failure conditions). actionCatch :: Exception e => Action a -> (e -> Action a) -> Action a actionCatch act hdl = Action $ catchRAW (fromAction act) $ \e -> case () of _ | not $ isAsyncException e , Nothing <- fromException e :: Maybe ShakeException , Just e <- fromException e -> fromAction $ hdl e _ -> throwRAW e -- | Retry an 'Action' if it throws an exception, at most /n/ times (where /n/ must be positive). -- If you need to call this function, you should probably try and fix the underlying cause (but you also probably know that). actionRetry :: Int -> Action a -> Action a actionRetry i act | i <= 0 = fail $ "actionRetry first argument must be positive, got " ++ show i | i == 1 = act | otherwise = Action $ catchRAW (fromAction act) $ \_ -> fromAction $ actionRetry (i-1) act --------------------------------------------------------------------- -- QUERIES -- | Get the initial 'ShakeOptions', these will not change during the build process. getShakeOptions :: Action ShakeOptions getShakeOptions = Action $ globalOptions <$> getRO -- | Get the current 'Progress' structure, as would be returned by 'shakeProgress'. getProgress :: Action Progress getProgress = do Global{..} <- Action getRO liftIO globalProgress -- | Specify an action to be run after the database has been closed, if building completes successfully. runAfter :: IO () -> Action () runAfter op = do Global{..} <- Action getRO liftIO $ atomicModifyIORef_ globalAfter (op:) --------------------------------------------------------------------- -- VERBOSITY putWhen :: Verbosity -> String -> Action () putWhen v msg = do Global{..} <- Action getRO verb <- getVerbosity when (verb >= v) $ liftIO $ globalOutput v msg -- | Write an unimportant message to the output, only shown when 'shakeVerbosity' is higher than normal ('Verbose' or above). 
-- The output will not be interleaved with any other Shake messages (other than those generated by system commands). putVerbose :: String -> Action () putVerbose = putWhen Verbose -- | Write a normal priority message to the output, only suppressed when 'shakeVerbosity' is 'Error', 'Warn' or 'Silent'. -- The output will not be interleaved with any other Shake messages (other than those generated by system commands). putInfo :: String -> Action () putInfo = putWhen Info -- | Write a semi important message to the output, only suppressed when 'shakeVerbosity' is 'Error' or 'Silent'. -- The output will not be interleaved with any other Shake messages (other than those generated by system commands). putWarn :: String -> Action () putWarn = putWhen Warn -- | Write an important message to the output, only suppressed when 'shakeVerbosity' is 'Silent'. -- The output will not be interleaved with any other Shake messages (other than those generated by system commands). putError :: String -> Action () putError = putWhen Error -- | Get the current verbosity level, originally set by 'shakeVerbosity'. If you -- want to output information to the console, you are recommended to use -- 'putVerbose' \/ 'putInfo' \/ 'putError', which ensures multiple messages are -- not interleaved. The verbosity can be modified locally by 'withVerbosity'. getVerbosity :: Action Verbosity getVerbosity = Action $ localVerbosity <$> getRW -- | Run an action with a particular verbosity level. -- Will not update the 'shakeVerbosity' returned by 'getShakeOptions' and will -- not have any impact on 'Diagnostic' tracing. withVerbosity :: Verbosity -> Action a -> Action a withVerbosity new = actionThenUndoLocal $ \s0 -> (s0{localVerbosity=new}, \s -> s{localVerbosity=localVerbosity s0}) -- | Run an action with 'Error' verbosity, in particular messages produced by 'traced' -- (including from 'Development.Shake.cmd' or 'Development.Shake.command') will not be printed to the screen. 
-- Will not update the 'shakeVerbosity' returned by 'getShakeOptions' and will -- not turn off any 'Diagnostic' tracing. quietly :: Action a -> Action a quietly = withVerbosity Error --------------------------------------------------------------------- -- BLOCK APPLY unsafeAllowApply :: Action a -> Action a unsafeAllowApply = applyBlockedBy Nothing blockApply :: String -> Action a -> Action a blockApply = applyBlockedBy . Just applyBlockedBy :: Maybe String -> Action a -> Action a applyBlockedBy reason = actionThenUndoLocal $ \s0 -> (s0{localBlockApply=reason}, \s -> s{localBlockApply=localBlockApply s0}) --------------------------------------------------------------------- -- TRACING -- | Write an action to the trace list, along with the start/end time of running the IO action. -- The 'Development.Shake.cmd' and 'Development.Shake.command' functions automatically call 'traced' -- with the name of the executable. The trace list is used for profile reports (see 'shakeReport'). -- -- By default 'traced' prints some useful extra context about what -- Shake is building, e.g.: -- -- > # traced message (for myobject.o) -- -- To suppress the output of 'traced' (for example you want more control -- over the message using 'putInfo'), use the 'quietly' combinator. -- -- It is recommended that the string passed to 'traced' is short and that only a small number of unique strings -- are used (makes profiling work better). -- The string does not need to make sense on its own, only in conjunction with the target it is building. 
traced :: String -> IO a -> Action a traced msg act = do Global{..} <- Action getRO Local{localStack} <- Action getRW start <- liftIO globalTimestamp let key = showTopStack localStack putInfo $ "# " ++ msg ++ " (for " ++ key ++ ")" res <- liftIO $ (shakeTrace globalOptions key msg True >> act) `finally` shakeTrace globalOptions key msg False stop <- liftIO globalTimestamp let trace = newTrace msg start stop liftIO $ evaluate $ rnf trace Action $ modifyRW $ \s -> s{localTraces = addTrace (localTraces s) trace} pure res --------------------------------------------------------------------- -- TRACKING -- | Track that a key has been used/read by the action preceding it when 'shakeLint' is active. lintTrackRead :: ShakeValue key => [key] -> Action () -- One of the following must be true: -- 1) you are the one building this key (e.g. key == topStack) -- 2) you have already been used by apply, and are on the dependency list -- 3) someone explicitly gave you permission with trackAllow -- 4) at the end of the rule, a) you are now on the dependency list, and b) this key itself has no dependencies (is source file) lintTrackRead ks = do Global{..} <- Action getRO when (isJust $ shakeLint globalOptions) $ do l@Local{..} <- Action getRW deps <- liftIO $ concatMapM (listDepends globalDatabase) $ enumerateDepends localDepends let top = topStack localStack let condition1 k = top == Just k let condition2 k = k `elem` deps let condition3 k = any ($ k) localTrackAllows let condition4 = filter (\k -> not $ condition1 k || condition2 k || condition3 k) $ map newKey ks unless (null condition4) $ Action $ putRW l{localTrackRead = condition4 ++ localTrackRead} -- | Track that a key has been changed/written by the action preceding it when 'shakeLint' is active. lintTrackWrite :: ShakeValue key => [key] -> Action () -- One of the following must be true: -- 1) you are the one building this key (e.g. 
key == topStack) -- 2) someone explicitly gave you permission with trackAllow -- 3) this file is never known to the build system, at the end it is not in the database lintTrackWrite ks = do Global{..} <- Action getRO when (isJust $ shakeLint globalOptions) $ do l@Local{..} <- Action getRW let top = topStack localStack let condition1 k = Just k == top let condition2 k = any ($ k) localTrackAllows let condition3 = filter (\k -> not $ condition1 k || condition2 k) $ map newKey ks unless (null condition3) $ Action $ putRW l{localTrackWrite = condition3 ++ localTrackWrite} lintTrackFinished :: Action () lintTrackFinished = do -- only called when isJust shakeLint Global{..} <- Action getRO Local{..} <- Action getRW liftIO $ do let top = topStack localStack -- must apply the ignore at the end, because we might have merged in more ignores that -- apply to other branches let ignore k = any ($ k) localTrackAllows -- Read stuff deps <- concatMapM (listDepends globalDatabase) $ enumerateDepends localDepends let used = Set.filter (not . ignore) $ Set.fromList localTrackRead -- check Read 4a bad<- pure $ Set.toList $ used `Set.difference` Set.fromList deps unless (null bad) $ do let n = length bad throwM $ errorStructured ("Lint checking error - " ++ (if n == 1 then "value was" else show n ++ " values were") ++ " used but not depended upon") [("Used", Just $ show x) | x <- bad] "" -- check Read 4b bad <- flip filterM (Set.toList used) $ \k -> not . null <$> lookupDependencies globalDatabase k unless (null bad) $ do let n = length bad throwM $ errorStructured ("Lint checking error - " ++ (if n == 1 then "value was" else show n ++ " values were") ++ " depended upon after being used") [("Used", Just $ show x) | x <- bad] "" -- check Write 3 bad<- pure $ filter (not . 
ignore) $ Set.toList $ Set.fromList localTrackWrite unless (null bad) $ liftIO $ atomicModifyIORef_ globalTrackAbsent ([(fromMaybe k top, k) | k <- bad] ++) -- | Allow any matching key recorded with 'lintTrackRead' or 'lintTrackWrite' in this action, -- after this call, to violate the tracking rules. lintTrackAllow :: ShakeValue key => (key -> Bool) -> Action () lintTrackAllow (test :: key -> Bool) = do Global{..} <- Action getRO when (isJust $ shakeLint globalOptions) $ Action $ modifyRW $ \s -> s{localTrackAllows = f : localTrackAllows s} where tk = typeRep (Proxy :: Proxy key) f k = typeKey k == tk && test (fromKey k) lintCurrentDirectory :: FilePath -> String -> IO () lintCurrentDirectory old msg = do now <- getCurrentDirectory when (old /= now) $ throwIO $ errorStructured "Lint checking error - current directory has changed" [("When", Just msg) ,("Wanted",Just old) ,("Got",Just now)] "" lintWatch :: [FilePattern] -> IO (String -> IO ()) lintWatch [] = pure $ const $ pure () lintWatch pats = do let op = getDirectoryFiles "." pats -- cache parsing of the pats let record = do xs <- op; forM xs $ \x -> (x,) <$> getFileInfo False (fileNameFromString x) old <- record pure $ \msg -> do now <- record when (old /= now) $ throwIO $ errorStructured "Lint checking error - watched files have changed" (("When", Just msg) : changes (Map.fromList old) (Map.fromList now)) "" where changes old now = [("Created", Just x) | x <- Map.keys $ Map.difference now old] ++ [("Deleted", Just x) | x <- Map.keys $ Map.difference old now] ++ [("Changed", Just x) | x <- Map.keys $ Map.filter id $ Map.intersectionWith (/=) old now] listDepends :: Database -> Depends -> IO [Key] listDepends db (Depends xs) = mapM (fmap (fst . fromJust) . getKeyValueFromId db) xs lookupDependencies :: Database -> Key -> IO [Depends] lookupDependencies db k = do Just (Ready r) <- getValueFromKey db k pure $ depends r -- | This rule should not be saved to shared/cloud storage via 'shakeShare'. 
-- There are usually two reasons to call this function: -- -- 1. It makes use of untracked dependencies that are specific to this machine, -- e.g. files in a system directory or items on the @$PATH@. -- 2. The rule is trivial to compute locally, so there is no point sharing it. -- -- If you want the rule to not be cached at all, use 'alwaysRerun'. historyDisable :: Action () historyDisable = Action $ modifyRW $ \s -> s{localHistory = False} -- | A version of 'produces' that checks the files actually exist producesChecked :: [FilePath] -> Action () producesChecked xs = Action $ modifyRW $ \s -> s{localProduces = map (True,) (reverse xs) ++ localProduces s} -- | A version of 'produces' that does not check. producesUnchecked :: [FilePath] -> Action () producesUnchecked xs = Action $ modifyRW $ \s -> s{localProduces = map (False,) (reverse xs) ++ localProduces s} producesCheck :: Action () producesCheck = do Local{localProduces} <- Action getRW missing <- liftIO $ filterM (notM . doesFileExist_) $ map snd $ filter fst localProduces when (missing /= []) $ throwM $ errorStructured "Files declared by 'produces' not produced" [("File " ++ show i, Just x) | (i,x) <- zipFrom 1 missing] "" -- | Run an action but do not depend on anything the action uses. -- A more general version of 'orderOnly'. orderOnlyAction :: Action a -> Action a orderOnlyAction act = Action $ do Local{localDepends=pre} <- getRW res <- fromAction act modifyRW $ \s -> s{localDepends=pre} pure res --------------------------------------------------------------------- -- MORE COMPLEX -- | A version of 'Development.Shake.newCache' that runs in IO, and can be called before calling 'Development.Shake.shake'. -- Most people should use 'Development.Shake.newCache' instead. 
newCacheIO :: (Eq k, Hashable k) => (k -> Action v) -> IO (k -> Action v) newCacheIO (act :: k -> Action v) = do var :: Var (Map.HashMap k (Fence IO (Either SomeException (DependsList,v)))) <- newVar Map.empty pure $ \key -> join $ liftIO $ modifyVar var $ \mp -> case Map.lookup key mp of Just bar -> pure $ (,) mp $ do (offset, (deps, v)) <- actionFenceRequeue bar Action $ modifyRW $ \s -> addDiscount offset $ s{localDepends = addDepends (localDepends s) deps} pure v Nothing -> do bar <- newFence pure $ (Map.insert key bar mp,) $ do Local{localDepends=pre} <- Action getRW Action $ modifyRW $ \s -> s{localDepends = newDepends []} res <- Action $ tryRAW $ fromAction $ act key case res of Left err -> do liftIO $ signalFence bar $ Left err Action $ throwRAW err Right v -> do Local{localDepends=deps} <- Action getRW Action $ modifyRW $ \s -> s{localDepends = addDepends pre deps} liftIO $ signalFence bar $ Right (deps, v) pure v -- | Run an action without counting to the thread limit, typically used for actions that execute -- on remote machines using barely any local CPU resources. -- Unsafe as it allows the 'shakeThreads' limit to be exceeded. -- You cannot depend on a rule (e.g. 'need') while the extra thread is executing. -- If the rule blocks (e.g. calls 'withResource') then the extra thread may be used by some other action. -- Only really suitable for calling 'cmd' / 'command'. unsafeExtraThread :: Action a -> Action a unsafeExtraThread act = do Global{..} <- Action getRO stop <- liftIO $ increasePool globalPool res <- Action $ tryRAW $ fromAction $ blockApply "Within unsafeExtraThread" act liftIO stop -- we start a new thread, giving up ours, to ensure the thread count goes down (wait, res) <- actionAlwaysRequeue res Action $ modifyRW $ addDiscount wait pure res -- | Execute a list of actions in parallel. In most cases 'need' will be more appropriate to benefit from parallelism. 
-- If the two types of 'Action' are different dependencies which ultimately boil down to 'apply', -- using 'Applicative' operations will still ensure the dependencies occur in parallel. -- The main use of this function is to run work that happens in an 'Action' in parallel. parallel :: [Action a] -> Action [a] -- Note: There is no parallel_ unlike sequence_ because there is no stack benefit to doing so parallel [] = pure [] parallel [x] = pure <$> x parallel acts = do Global{..} <- Action getRO done <- liftIO $ newIORef False waits <- forM acts $ \act -> addPoolWait PoolResume $ do whenM (liftIO $ readIORef done) $ fail "parallel, one has already failed" Action $ modifyRW localClearMutable res <- act old <- Action getRW pure (old, res) (wait, res) <- actionFenceSteal =<< liftIO (exceptFence waits) liftIO $ atomicWriteIORef done True let (waits, locals, results) = unzip3 $ map (\(a,(b,c)) -> (a,b,c)) res Action $ modifyRW $ \root -> addDiscount (wait - sum waits) $ localMergeMutable root locals pure results -- | Batch different outputs into a single 'Action', typically useful when a command has a high -- startup cost - e.g. @apt-get install foo bar baz@ is a lot cheaper than three separate -- calls to @apt-get install@. 
As an example, if we have a standard build rule:
--
-- @
-- \"*.out\" 'Development.Shake.%>' \\out -> do
--     'Development.Shake.need' [out '-<.>' \"in\"]
--     'Development.Shake.cmd' "build-multiple" [out '-<.>' \"in\"]
-- @
--
-- Assuming that @build-multiple@ can compile multiple files in a single run,
-- and that the cost of doing so is a lot less than running each individually,
-- we can write:
--
-- @
-- 'batch' 3 (\"*.out\" 'Development.Shake.%>')
--     (\\out -> do 'Development.Shake.need' [out '-<.>' \"in\"]; pure out)
--     (\\outs -> 'Development.Shake.cmd' "build-multiple" [out '-<.>' \"in\" | out \<- outs])
-- @
--
-- In contrast to the normal call, we have specified a maximum batch size of 3,
-- an action to run on each output individually (typically all the 'need' dependencies),
-- and an action that runs on multiple files at once. If we were to require lots of
-- @*.out@ files, they would typically be built in batches of 3.
--
-- If Shake ever has nothing else to do it will run batches before they are at the maximum,
-- so you may see much smaller batches, especially at high parallelism settings.
batch
    :: Int -- ^ Maximum number to run in a single batch, e.g. @3@, must be positive.
    -> ((a -> Action ()) -> Rules ()) -- ^ Way to match an entry, e.g. @\"*.ext\" '%>'@.
    -> (a -> Action b) -- ^ Preparation to run individually on each, e.g. using 'need'.
    -> ([b] -> Action ()) -- ^ Combination action to run on all, e.g. using 'cmd'.
-> Rules ()
batch mx pred one many
    | mx <= 0 = error $ "Can't call batch with <= 0, you used " ++ show mx
    | mx == 1 = pred $ \a -> do b <- one a; many [b]
    | otherwise = do
        todo :: IORef (Int, [(b, Local, Fence IO (Either SomeException (Seconds, Local)))]) <- liftIO $ newIORef (0, [])
        pred $ \a -> do
            b <- one a
            fence <- liftIO newFence
            -- add one to the batch
            local <- Action getRW
            count <- liftIO $ atomicModifyIORef todo $ \(count, bs) -> let i = count+1 in ((i, (b,local,fence):bs), i)
            requeue todo (==) count
            (wait, (cost, local2)) <- actionFenceRequeue fence
            Action $ modifyRW $ \root -> addDiscount (wait - cost) $ localMergeMutable root [local2]
    where
        -- When changing by one, only trigger on (==) so we don't have lots of waiting pool entries
        -- When changing by many, trigger on (>=) because we don't hit all edges
        requeue todo trigger count
            | count `trigger` mx = addPoolWait_ PoolResume $ go todo
            | count `trigger` 1 = addPoolWait_ PoolBatch $ go todo
            | otherwise = pure ()

        go todo = do
            -- delete at most mx from the batch
            (now, count) <- liftIO $ atomicModifyIORef todo $ \(count, bs) ->
                let (now,later) = splitAt mx bs
                    count2 = if count > mx then count - mx else 0
                in ((count2, later), (now, count2))
            requeue todo (>=) count
            unless (null now) $ do
                res <- Action $ tryRAW $ do
                    -- make sure we are using one of the locals that we are computing
                    -- so things like stack, blockApply etc. work as expected
                    modifyRW $ const $ localClearMutable $ snd3 $ headErr now
                    start <- liftIO offsetTime
                    fromAction $ many $ map fst3 now
                    end <- liftIO start
                    -- accounting for time is tricky, we spend time T, over N jobs
                    -- so want to charge everyone for T / N time
                    -- but that also means we need to subtract localDiscount so we don't apply that to all
                    rw <- getRW
                    let t = end - localDiscount rw
                    let n = intToDouble (length now)
                    pure (t / n, rw{localDiscount = 0})
                liftIO $ mapM_ (flip signalFence res .
thd3) now -- | Given a running task, reschedule so it only continues after all other pending tasks, -- and all rescheduled tasks with a higher pool priority. Note that due to parallelism there is no guarantee -- that all actions of a higher pool priority will have /completed/ before the action resumes. -- Only useful if the results are being interactively reported or consumed. reschedule :: Double -> Action () reschedule x = do (wait, _) <- actionAlwaysRequeuePriority (PoolDeprioritize $ negate x) $ pure () Action $ modifyRW $ addDiscount wait getCurrentKey :: Action (Maybe Key) getCurrentKey = Action $ topStack . localStack <$> getRW getLocal :: Action Local getLocal = Action getRW -- | Hooked up to --share-remove actionShareRemove :: [String] -> Action () actionShareRemove substrs = do Global{..} <- Action getRO case globalShared of Nothing -> throwM $ errorInternal "actionShareRemove with no shared" Just x -> liftIO $ removeShared x $ \k -> any (`isInfixOf` show k) substrs -- | Hooked up to --share-list actionShareList :: Action () actionShareList = do Global{..} <- Action getRO case globalShared of Nothing -> throwM $ errorInternal "actionShareList with no shared" Just x -> liftIO $ listShared x -- | Hooked up to --share-sanity actionShareSanity :: Action () actionShareSanity = do Global{..} <- Action getRO case globalShared of Nothing -> throwM $ errorInternal "actionShareSanity with no shared" Just x -> liftIO $ sanityShared x shake-0.19.8/src/Development/Shake/Internal/Core/Build.hs0000644000000000000000000004014007346545000021237 0ustar0000000000000000{-# LANGUAGE LambdaCase #-} {-# LANGUAGE RecordWildCards, PatternGuards, ScopedTypeVariables, NamedFieldPuns, GADTs #-} {-# LANGUAGE Rank2Types, ConstraintKinds, TypeOperators, TupleSections, ViewPatterns #-} module Development.Shake.Internal.Core.Build( getDatabaseValue, getDatabaseValueGeneric, historyIsEnabled, historySave, historyLoad, applyKeyValue, apply, apply1, ) where import Development.Shake.Classes 
import General.Pool import Development.Shake.Internal.Core.Database import Development.Shake.Internal.Value import Development.Shake.Internal.Errors import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Action import Development.Shake.Internal.History.Shared import Development.Shake.Internal.History.Cloud import Development.Shake.Internal.Options import Development.Shake.Internal.Core.Monad import General.Wait import qualified Data.ByteString.Char8 as BS import Control.Monad.IO.Class import General.Extra import General.Intern(Id) import Control.Exception import Control.Monad.Extra import Numeric.Extra import qualified Data.HashMap.Strict as Map import Development.Shake.Internal.Core.Rules import Data.Typeable import Data.Maybe import Data.List.Extra import Data.Either.Extra import System.Time.Extra --------------------------------------------------------------------- -- LOW-LEVEL OPERATIONS ON THE DATABASE setIdKeyStatus :: Global -> Database -> Id -> Key -> Status -> Locked () setIdKeyStatus Global{..} db i k v = do liftIO $ globalDiagnostic $ do -- actually safe because we only lose the Locked to enter the diagnostic context old <- getKeyValueFromId db i let changeStatus = maybe "Missing" (statusType . snd) old ++ " -> " ++ statusType v ++ ", " ++ maybe "" (show . 
fst) old let changeValue = case v of Ready r -> Just $ " = " ++ showBracket (result r) ++ " " ++ if changed r == globalStep then "(changed)" else if built r == globalStep then "(unchanged)" else "(didn't run)" _ -> Nothing pure $ changeStatus ++ maybe "" ("\n" ++) changeValue setMem db i k v --------------------------------------------------------------------- -- QUERIES getDatabaseValue :: (RuleResult key ~ value, ShakeValue key, Typeable value) => key -> Action (Maybe (Result (Either BS.ByteString value))) getDatabaseValue k = fmap (fmap $ fmap $ fmap fromValue) $ getDatabaseValueGeneric $ newKey k getDatabaseValueGeneric :: Key -> Action (Maybe (Result (Either BS.ByteString Value))) getDatabaseValueGeneric k = do Global{..} <- Action getRO Just status <- liftIO $ getValueFromKey globalDatabase k pure $ getResult status --------------------------------------------------------------------- -- NEW STYLE PRIMITIVES -- | Lookup the value for a single Id, may need to spawn it lookupOne :: Global -> Stack -> Database -> Id -> Wait Locked (Either SomeException (Result (Value, BS_Store))) lookupOne global stack database i = do res <- quickly $ liftIO $ getKeyValueFromId database i case res of Nothing -> Now $ Left $ errorStructured "Shake Id no longer exists" [("Id", Just $ show i)] "" Just (k, s) -> case s of Ready r -> Now $ Right r Failed e _ -> Now $ Left e Running{} | Left e <- addStack i k stack -> Now $ Left e _ -> Later $ \continue -> do Just (_, s) <- liftIO $ getKeyValueFromId database i case s of Ready r -> continue $ Right r Failed e _ -> continue $ Left e Running (NoShow w) r -> do let w2 v = w v >> continue v setMem database i k $ Running (NoShow w2) r Loaded r -> buildOne global stack database i k (Just r) `fromLater` continue Missing -> buildOne global stack database i k Nothing `fromLater` continue -- | Build a key, must currently be either Loaded or Missing, changes to Waiting buildOne :: Global -> Stack -> Database -> Id -> Key -> Maybe (Result 
BS.ByteString) -> Wait Locked (Either SomeException (Result (Value, BS_Store))) buildOne global@Global{..} stack database i k r = case addStack i k stack of Left e -> do quickly $ setIdKeyStatus global database i k $ mkError e pure $ Left e Right stack -> Later $ \continue -> do setIdKeyStatus global database i k (Running (NoShow continue) r) let go = buildRunMode global stack database r fromLater go $ \mode -> liftIO $ addPool PoolStart globalPool $ runKey global stack k r mode $ \res -> do runLocked database $ do let val = fmap runValue res res <- liftIO $ getKeyValueFromId database i w <- case res of Just (_, Running (NoShow w) _) -> pure w -- We used to be able to hit here, but we fixed it by ensuring the thread pool workers are all -- dead _before_ any exception bubbles up _ -> throwM $ errorInternal $ "expected Waiting but got " ++ maybe "nothing" (statusType . snd) res ++ ", key " ++ show k setIdKeyStatus global database i k $ either mkError Ready val w val case res of Right RunResult{..} | runChanged /= ChangedNothing -> setDisk database i k $ Loaded runValue{result=runStore} _ -> pure () where mkError e = Failed e $ if globalOneShot then Nothing else r -- | Compute the value for a given RunMode and a restore function to run buildRunMode :: Global -> Stack -> Database -> Maybe (Result a) -> Wait Locked RunMode buildRunMode global stack database me = do changed <- case me of Nothing -> pure True Just me -> buildRunDependenciesChanged global stack database me pure $ if changed then RunDependenciesChanged else RunDependenciesSame -- | Have the dependencies changed buildRunDependenciesChanged :: Global -> Stack -> Database -> Result a -> Wait Locked Bool buildRunDependenciesChanged global stack database me = isJust <$> firstJustM id [firstJustWaitUnordered (fmap test . 
lookupOne global stack database) x | Depends x <- depends me] where test (Right dep) | changed dep <= built me = Nothing test _ = Just () --------------------------------------------------------------------- -- ACTUAL WORKERS applyKeyValue :: [String] -> [Key] -> Action [Value] applyKeyValue callStack ks = do -- this is the only place a user can inject a key into our world, so check they aren't throwing -- in unevaluated bottoms liftIO $ mapM_ (evaluate . rnf) ks global@Global{..} <- Action getRO Local{localStack, localBlockApply} <- Action getRW let stack = addCallStack callStack localStack let tk = typeKey $ headDef (newKey ()) ks -- always called at non-empty so never see () key whenJust localBlockApply $ throwM . errorNoApply tk (show <$> listToMaybe ks) let database = globalDatabase (is, wait) <- liftIO $ runLocked database $ do is <- mapM (mkId database) ks wait <- runWait $ do x <- firstJustWaitUnordered (fmap (either Just (const Nothing)) . lookupOne global stack database) $ nubOrd is case x of Just e -> pure $ Left e Nothing -> quickly $ Right <$> mapM (fmap (\(Just (_, Ready r)) -> fst $ result r) . liftIO . getKeyValueFromId database) is pure (is, wait) Action $ modifyRW $ \s -> s{localDepends = addDepends1 (localDepends s) $ Depends is} case wait of Now vs -> either throwM pure vs _ -> do offset <- liftIO offsetTime vs <- Action $ captureRAW $ \continue -> runLocked globalDatabase $ fromLater wait $ \x -> liftIO $ addPool (if isLeft x then PoolException else PoolResume) globalPool $ continue x offset <- liftIO offset Action $ modifyRW $ addDiscount offset pure vs runKey :: Global -> Stack -- Given the current stack with the key added on -> Key -- The key to build -> Maybe (Result BS.ByteString) -- A previous result, or Nothing if never been built before -> RunMode -- True if any of the children were dirty -> Capture (Either SomeException (RunResult (Result (Value, BS_Store)))) -- Either an error, or a (the produced files, the result). 
runKey global@Global{globalOptions=ShakeOptions{..},..} stack k r mode continue = do
    let tk = typeKey k
    BuiltinRule{..} <- case Map.lookup tk globalRules of
        Nothing -> throwM $ errorNoRuleToBuildType tk (Just $ show k) Nothing
        Just r -> pure r
    let s = (newLocal stack shakeVerbosity){localBuiltinVersion = builtinVersion}
    time <- offsetTime
    runAction global s (do
        res <- builtinRun k (fmap result r) mode
        liftIO $ evaluate $ rnf res
        -- completed, now track anything required afterwards
        when (runChanged res `elem` [ChangedRecomputeSame,ChangedRecomputeDiff]) $ do
            -- if the user's code didn't run you don't have to check anything (we assume builtin rules are correct)
            globalRuleFinished k
            producesCheck
        Action $ fmap (res,) getRW) $ \case
            Left e ->
                continue . Left . toException =<< shakeException global stack e
            Right (RunResult{..}, Local{..})
                | runChanged == ChangedNothing || runChanged == ChangedStore, Just r <- r ->
                    continue $ Right $ RunResult runChanged runStore (r{result = mkResult runValue runStore})
                | otherwise -> do
                    dur <- time
                    let (cr, c) | Just r <- r, runChanged == ChangedRecomputeSame = (ChangedRecomputeSame, changed r)
                                | otherwise = (ChangedRecomputeDiff, globalStep)
                    continue $ Right $ RunResult cr runStore Result
                        {result = mkResult runValue runStore
                        ,changed = c
                        ,built = globalStep
                        ,depends = flattenDepends localDepends
                        ,execution = doubleToFloat $ dur - localDiscount
                        ,traces = flattenTraces localTraces}
    where
        mkResult value store = (value, if globalOneShot then BS.empty else store)

---------------------------------------------------------------------
-- USER key/value WRAPPERS

-- | Execute a rule, returning the associated values. If possible, the rules will be run in parallel.
--   This function requires that appropriate rules have been added with 'addBuiltinRule'.
--   All @key@ values passed to 'apply' become dependencies of the 'Action'.
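--
--   As an illustrative sketch (the key type @CompileKey@, its @RuleResult@
--   instance and the 'addBuiltinRule' registration are hypothetical, not part
--   of Shake itself):
--
-- @
-- newtype CompileKey = CompileKey FilePath
--     deriving (Show, Eq, Hashable, Binary, NFData, Typeable)
-- type instance RuleResult CompileKey = ()
--
-- compileAll :: [FilePath] -> Action ()
-- compileAll files = void $ 'apply' (map CompileKey files) -- run in parallel where possible
-- @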
apply :: (Partial, RuleResult key ~ value, ShakeValue key, Typeable value) => [key] -> Action [value] apply [] = -- if they do [] then we don't test localBlockApply, but unclear if that should be an error or not pure [] apply ks = fmap (map fromValue) $ Action $ stepRAW (callStackFull, map newKey ks) -- | Apply a single rule, equivalent to calling 'apply' with a singleton list. Where possible, -- use 'apply' to allow parallelism. apply1 :: (Partial, RuleResult key ~ value, ShakeValue key, Typeable value) => key -> Action value apply1 = withFrozenCallStack $ fmap headErr . apply . pure --------------------------------------------------------------------- -- HISTORY STUFF -- | Load a value from the history. Given a version from any user rule -- (or @0@), return the payload that was stored by 'historySave'. -- -- If this function returns 'Just' it will also have restored any files that -- were saved by 'historySave'. historyLoad :: Int -> Action (Maybe BS.ByteString) historyLoad (Ver -> ver) = do global@Global{..} <- Action getRO Local{localStack, localBuiltinVersion} <- Action getRW if isNothing globalShared && isNothing globalCloud then pure Nothing else do key <- liftIO $ evaluate $ fromMaybe (error "Can't call historyLoad outside a rule") $ topStack localStack let database = globalDatabase res <- liftIO $ runLocked database $ runWait $ do let ask k = do i <- quickly $ mkId database k let identify = runIdentify globalRules k . fst . result either (const Nothing) identify <$> lookupOne global localStack database i x <- case globalShared of Nothing -> pure Nothing Just shared -> lookupShared shared ask key localBuiltinVersion ver x <- case x of Just res -> pure $ Just res Nothing -> case globalCloud of Nothing -> pure Nothing Just cloud -> lookupCloud cloud ask key localBuiltinVersion ver case x of Nothing -> pure Nothing Just (a,b,c) -> quickly $ Just . 
(a,,c) <$> mapM (mapM $ mkId database) b -- FIXME: If running with cloud and shared, and you got a hit in cloud, should also add it to shared res <- case res of Now x -> pure x _ -> do offset <- liftIO offsetTime res <- Action $ captureRAW $ \continue -> runLocked globalDatabase $ fromLater res $ \x -> liftIO $ addPool PoolResume globalPool $ continue $ Right x offset <- liftIO offset Action $ modifyRW $ addDiscount offset pure res case res of Nothing -> pure Nothing Just (res, deps, restore) -> do liftIO $ globalDiagnostic $ pure $ "History hit for " ++ show key liftIO restore Action $ modifyRW $ \s -> s{localDepends = newDepends $ map Depends deps} pure (Just res) -- | Is the history enabled, returns 'True' if you have a 'shakeShare' or 'shakeCloud', -- and haven't called 'historyDisable' so far in this rule. historyIsEnabled :: Action Bool historyIsEnabled = Action $ do Global{..} <- getRO Local{localHistory} <- getRW pure $ localHistory && (isJust globalShared || isJust globalCloud) -- | Save a value to the history. Record the version of any user rule -- (or @0@), and a payload. Must be run at the end of the rule, after -- any dependencies have been captured. If history is enabled, stores the information -- in a cache. -- -- This function relies on 'produces' to have been called correctly to describe -- which files were written during the execution of this rule. 
historySave :: Int -> BS.ByteString -> Action () historySave (Ver -> ver) store = whenM historyIsEnabled $ Action $ do Global{..} <- getRO Local{localProduces, localDepends, localBuiltinVersion, localStack} <- getRW liftIO $ do -- make sure we throw errors before we get into the history evaluate ver evaluate store key <- evaluate $ fromMaybe (error "Can't call historySave outside a rule") $ topStack localStack let produced = reverse $ map snd localProduces deps <- -- can do this without the DB lock, since it reads things that are stable forNothingM (flattenDepends localDepends) $ \(Depends is) -> forNothingM is $ \i -> do Just (k, Ready r) <- getKeyValueFromId globalDatabase i pure $ (k,) <$> runIdentify globalRules k (fst $ result r) let k = topStack localStack case deps of Nothing -> liftIO $ globalDiagnostic $ pure $ "Dependency with no identity for " ++ show k Just deps -> do whenJust globalShared $ \shared -> addShared shared key localBuiltinVersion ver deps store produced whenJust globalCloud $ \cloud -> addCloud cloud key localBuiltinVersion ver deps store produced liftIO $ globalDiagnostic $ pure $ "History saved for " ++ show k runIdentify :: Map.HashMap TypeRep BuiltinRule -> Key -> Value -> Maybe BS.ByteString runIdentify mp k v | Just BuiltinRule{..} <- Map.lookup (typeKey k) mp = builtinIdentity k v | otherwise = throwImpure $ errorInternal "runIdentify can't find rule" shake-0.19.8/src/Development/Shake/Internal/Core/Database.hs0000644000000000000000000000705507346545000021714 0ustar0000000000000000{-# LANGUAGE BangPatterns #-} {-# LANGUAGE GeneralizedNewtypeDeriving, RecordWildCards #-} module Development.Shake.Internal.Core.Database( Locked, runLocked, DatabasePoly, createDatabase, mkId, getValueFromKey, getIdFromKey, getKeyValues, getKeyValueFromId, getKeyValuesFromId, setMem, setDisk, modifyAllMem ) where import Data.Tuple.Extra import Data.IORef.Extra import General.Intern(Id, Intern) import Development.Shake.Classes import qualified 
Data.HashMap.Strict as Map import qualified General.Intern as Intern import Control.Concurrent.Extra import Control.Monad.IO.Class import qualified General.Ids as Ids import Control.Monad.Fail import Prelude newtype Locked a = Locked (IO a) deriving (Functor, Applicative, Monad, MonadIO, MonadFail) runLocked :: DatabasePoly k v -> Locked b -> IO b runLocked db (Locked act) = withLock (lock db) act -- | Invariant: The database does not have any cycles where a Key depends on itself. -- Everything is mutable. intern and status must form a bijection. -- There may be dangling Id's as a result of version changes. -- Lock is used to prevent any torn updates data DatabasePoly k v = Database {lock :: Lock ,intern :: IORef (Intern k) -- ^ Key |-> Id mapping ,status :: Ids.Ids (k, v) -- ^ Id |-> (Key, Status) mapping ,journal :: Id -> k -> v -> IO () -- ^ Record all changes to status ,vDefault :: v } createDatabase :: (Eq k, Hashable k) => Ids.Ids (k, v) -> (Id -> k -> v -> IO ()) -> v -> IO (DatabasePoly k v) createDatabase status journal vDefault = do xs <- Ids.toList status intern <- newIORef $ Intern.fromList [(k, i) | (i, (k,_)) <- xs] lock <- newLock pure Database{..} --------------------------------------------------------------------- -- SAFE READ-ONLY getValueFromKey :: (Eq k, Hashable k) => DatabasePoly k v -> k -> IO (Maybe v) getValueFromKey Database{..} k = do is <- readIORef intern case Intern.lookup k is of Nothing -> pure Nothing Just i -> fmap snd <$> Ids.lookup status i -- Returns Nothing only if the Id was serialised previously but then the Id disappeared getKeyValueFromId :: DatabasePoly k v -> Id -> IO (Maybe (k, v)) getKeyValueFromId Database{..} = Ids.lookup status getKeyValues :: DatabasePoly k v -> IO [(k, v)] getKeyValues Database{..} = Ids.elems status getKeyValuesFromId :: DatabasePoly k v -> IO (Map.HashMap Id (k, v)) getKeyValuesFromId Database{..} = Ids.toMap status getIdFromKey :: (Eq k, Hashable k) => DatabasePoly k v -> IO (k -> Maybe Id) 
getIdFromKey Database{..} = do is <- readIORef intern pure $ flip Intern.lookup is --------------------------------------------------------------------- -- MUTATING -- | Ensure that a Key has a given Id, creating an Id if there is not one already mkId :: (Eq k, Hashable k) => DatabasePoly k v -> k -> Locked Id mkId Database{..} k = liftIO $ do is <- readIORef intern case Intern.lookup k is of Just i -> pure i Nothing -> do (is, i)<- pure $ Intern.add k is -- make sure to write it into Status first to maintain Database invariants Ids.insert status i (k, vDefault) writeIORef' intern is pure i setMem :: DatabasePoly k v -> Id -> k -> v -> Locked () setMem Database{..} i k v = liftIO $ Ids.insert status i (k,v) modifyAllMem :: DatabasePoly k v -> (v -> v) -> Locked () modifyAllMem Database{..} f = liftIO $ Ids.forMutate status $ \(k,v) -> let !v' = f v in (k, v') setDisk :: DatabasePoly k v -> Id -> k -> v -> IO () setDisk = journal shake-0.19.8/src/Development/Shake/Internal/Core/Monad.hs0000644000000000000000000001672507346545000021252 0ustar0000000000000000{-# LANGUAGE LambdaCase #-} {-# LANGUAGE GADTs, ScopedTypeVariables, TupleSections, GeneralizedNewtypeDeriving #-} module Development.Shake.Internal.Core.Monad( RAW, Capture, runRAW, getRO, getRW, putRW, modifyRW, stepRAW, catchRAW, tryRAW, throwRAW, finallyRAW, captureRAW, ) where import Control.Exception.Extra import Development.Shake.Internal.Errors import Control.Monad.IO.Class import Data.IORef import Control.Monad import System.IO import Data.Semigroup import Control.Monad.Fail import Prelude data RAW k v ro rw a where Fmap :: (a -> b) -> RAW k v ro rw a -> RAW k v ro rw b Pure :: a -> RAW k v ro rw a Ap :: RAW k v ro rw (a -> b) -> RAW k v ro rw a -> RAW k v ro rw b Next :: RAW k v ro rw a -> RAW k v ro rw b -> RAW k v ro rw b Bind :: RAW k v ro rw a -> (a -> RAW k v ro rw b) -> RAW k v ro rw b LiftIO :: IO a -> RAW k v ro rw a GetRO :: RAW k v ro rw ro GetRW :: RAW k v ro rw rw PutRW :: !rw -> RAW k v ro 
rw () ModifyRW :: (rw -> rw) -> RAW k v ro rw () StepRAW :: k -> RAW k v ro rw v CaptureRAW :: Capture (Either SomeException a) -> RAW k v ro rw a CatchRAW :: RAW k v ro rw a -> (SomeException -> RAW k v ro rw a) -> RAW k v ro rw a instance Functor (RAW k v ro rw) where fmap = Fmap instance Applicative (RAW k v ro rw) where pure = Pure (*>) = Next (<*>) = Ap instance Monad (RAW k v ro rw) where return = pure (>>) = (*>) (>>=) = Bind instance MonadIO (RAW k v ro rw) where liftIO = LiftIO instance MonadFail (RAW k v ro rw) where fail = liftIO . Control.Monad.Fail.fail instance Semigroup a => Semigroup (RAW k v ro rw a) where (<>) a b = (<>) <$> a <*> b instance (Semigroup a, Monoid a) => Monoid (RAW k v ro rw a) where mempty = pure mempty mappend = (<>) type Capture a = (a -> IO ()) -> IO () -- Useful for checking that all continuations are run only once -- Cannot be enabled for performance reasons and because some of -- "monad test" deliberately breaks the invariant to check it doesn't go wrong assertOnceCheck = False assertOnce :: MonadIO m => String -> (a -> m b) -> IO (a -> m b) assertOnce msg k | not assertOnceCheck = pure k | otherwise = do ref <- liftIO $ newIORef False pure $ \v -> do liftIO $ join $ atomicModifyIORef ref $ \old -> (True,) $ when old $ do hPutStrLn stderr "FATAL ERROR: assertOnce failed" Prelude.fail $ "assertOnce failed: " ++ msg k v -- | Run and then call a continuation. 
runRAW :: ([k] -> RAW k v ro rw [v]) -> ro -> rw -> RAW k v ro rw a -> Capture (Either SomeException a) runRAW step ro rw m k = do k <- assertOnce "runRAW" k rw <- newIORef rw handler <- newIORef throwIO steps <- newSteps writeIORef handler $ \e -> do -- make sure we never call the error continuation twice writeIORef handler throwIO k $ Left e -- If the continuation itself throws an error we need to make sure we -- don't end up running it twice (once with its result, once with its own exception) goRAW step steps handler ro rw m (\v -> do writeIORef handler throwIO; k $ Right v) `catch_` \e -> ($ e) =<< readIORef handler goRAW :: forall k v ro rw a . ([k] -> RAW k v ro rw [v]) -> Steps k v -> IORef (SomeException -> IO ()) -> ro -> IORef rw -> RAW k v ro rw a -> Capture a goRAW step steps handler ro rw = \x k -> go x $ \v -> sio v k where sio :: SIO b -> Capture b sio (SIO v) k = flush $ do v <- v; k v flush :: IO () -> IO () flush k = do v <- flushSteps steps case v of Nothing -> k Just f -> go (f step) $ const k unflush :: IO () unflush = unflushSteps steps go :: RAW k v ro rw b -> Capture (SIO b) go x k = case x of Fmap f a -> go a $ \v -> k $ fmap f v Pure a -> k $ pure a Ap f x -> go f $ \f -> go x $ \v -> k $ f <*> v Next a b -> go a $ \a -> go b $ \b -> k $ a *> b StepRAW q -> do v <- addStep steps q k v Bind a b -> go a $ \a -> sio a $ \a -> go (b a) k LiftIO act -> flush $ do v <- act; k $ pure v GetRO -> k $ pure ro GetRW -> flush $ k . 
pure =<< readIORef rw PutRW x -> flush $ writeIORef rw x >> k (pure ()) ModifyRW f -> flush $ modifyIORef' rw f >> k (pure ()) CatchRAW m hdl -> flush $ do hdl <- assertOnce "CatchRAW" hdl old <- readIORef handler writeIORef handler $ \e -> do writeIORef handler old go (hdl e) k `catch_` \e -> do unflush; ($ e) =<< readIORef handler go m $ \x -> writeIORef handler old >> k x CaptureRAW f -> flush $ do f <- assertOnce "CaptureRAW" f old <- readIORef handler writeIORef handler throwIO f $ \case Left e -> old e Right v -> do writeIORef handler old k (pure v) `catch_` \e -> do unflush; ($ e) =<< readIORef handler newtype SIO a = SIO (IO a) deriving (Functor, Monad, Applicative) newtype Steps k v = Steps (IORef [(k, IORef v)]) newSteps :: IO (Steps k v) newSteps = Steps <$> newIORef [] addStep :: Steps k v -> k -> IO (SIO v) addStep (Steps ref) k = do out <- newIORef $ throwImpure $ errorInternal "Monad, addStep not flushed" modifyIORef ref ((k,out):) pure $ SIO $ readIORef out unflushSteps :: Steps k v -> IO () unflushSteps (Steps ref) = writeIORef ref [] flushSteps :: MonadIO m => Steps k v -> IO (Maybe (([k] -> m [v]) -> m ())) flushSteps (Steps ref) = do v <- reverse <$> readIORef ref case v of [] -> pure Nothing xs -> do writeIORef ref [] pure $ Just $ \step -> do vs <- step $ map fst xs liftIO $ zipWithM_ writeIORef (map snd xs) vs --------------------------------------------------------------------- -- STANDARD getRO :: RAW k v ro rw ro getRO = GetRO getRW :: RAW k v ro rw rw getRW = GetRW -- | Strict version putRW :: rw -> RAW k v ro rw () putRW = PutRW modifyRW :: (rw -> rw) -> RAW k v ro rw () modifyRW = ModifyRW --------------------------------------------------------------------- -- EXCEPTIONS catchRAW :: RAW k v ro rw a -> (SomeException -> RAW k v ro rw a) -> RAW k v ro rw a catchRAW = CatchRAW tryRAW :: RAW k v ro rw a -> RAW k v ro rw (Either SomeException a) tryRAW m = catchRAW (fmap Right m) (pure . 
Left) throwRAW :: Exception e => e -> RAW k v ro rw a -- Note that while we could directly pass this to the handler -- that would avoid triggering the catch, which would mean they built up on the stack throwRAW = liftIO . throwIO finallyRAW :: RAW k v ro rw a -> RAW k v ro rw b -> RAW k v ro rw a finallyRAW a undo = do r <- catchRAW a (\e -> undo >> throwRAW e) undo pure r --------------------------------------------------------------------- -- CONTINUATIONS -- | Capture a continuation. The continuation should be called at most once. -- Calling the same continuation, multiple times, in parallel, results in incorrect behaviour. captureRAW :: Capture (Either SomeException a) -> RAW k v ro rw a captureRAW = CaptureRAW --------------------------------------------------------------------- -- STEPS stepRAW :: k -> RAW k v ro rw v stepRAW = StepRAW shake-0.19.8/src/Development/Shake/Internal/Core/Pool.hs0000644000000000000000000000617007346545000021116 0ustar0000000000000000{-# LANGUAGE RecordWildCards, TupleSections #-} module Development.Shake.Internal.Core.Pool( addPoolWait, actionFenceSteal, actionFenceRequeue, actionAlwaysRequeue, actionAlwaysRequeuePriority, addPoolWait_, actionFenceRequeueBy ) where import Control.Exception import General.Pool import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Monad import System.Time.Extra import Data.Either.Extra import Control.Monad.IO.Class import General.Fence priority x = if isLeft x then PoolException else PoolResume -- | Enqueue an Action into the pool and return a Fence to wait for it. -- Returns the value along with how long it spent executing. 
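-- A sketch of the intended pairing, for internal use only (names are from this
-- module and "Development.Shake.Internal.Core.Types"):
--
-- @
-- fence <- addPoolWait PoolResume act               -- enqueue act on the pool
-- (waited, (spent, res)) <- actionFenceRequeue fence -- requeue once it signals
-- @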
addPoolWait :: PoolPriority -> Action a -> Action (Fence IO (Either SomeException (Seconds, a))) addPoolWait pri act = do ro@Global{..} <- Action getRO rw <- Action getRW liftIO $ do fence <- newFence let act2 = do offset <- liftIO offsetTime; res <- act; offset <- liftIO offset; pure (offset, res) addPool pri globalPool $ runAction ro rw act2 $ signalFence fence pure fence -- | Like 'addPoolWait' but doesn't provide a fence to wait for it - a fire and forget version. -- Warning: If Action throws an exception, it would be lost, so must be executed with try. Seconds are not tracked. addPoolWait_ :: PoolPriority -> Action a -> Action () addPoolWait_ pri act = do ro@Global{..} <- Action getRO rw <- Action getRW liftIO $ addPool pri globalPool $ runAction ro rw act $ \_ -> pure () actionFenceSteal :: Fence IO (Either SomeException a) -> Action (Seconds, a) actionFenceSteal fence = do res <- liftIO $ testFence fence case res of Just (Left e) -> Action $ throwRAW e Just (Right v) -> pure (0, v) Nothing -> Action $ captureRAW $ \continue -> do offset <- offsetTime waitFence fence $ \v -> do offset <- offset continue $ (offset,) <$> v actionFenceRequeue :: Fence IO (Either SomeException b) -> Action (Seconds, b) actionFenceRequeue = actionFenceRequeueBy id actionFenceRequeueBy :: (a -> Either SomeException b) -> Fence IO a -> Action (Seconds, b) actionFenceRequeueBy op fence = Action $ do res <- liftIO $ testFence fence case fmap op res of Just (Left e) -> throwRAW e Just (Right v) -> pure (0, v) Nothing -> do Global{..} <- getRO offset <- liftIO offsetTime captureRAW $ \continue -> waitFence fence $ \v -> do let v2 = op v addPool (priority v2) globalPool $ do offset <- offset continue $ (offset,) <$> v2 actionAlwaysRequeue :: Either SomeException a -> Action (Seconds, a) actionAlwaysRequeue res = actionAlwaysRequeuePriority (priority res) res actionAlwaysRequeuePriority :: PoolPriority -> Either SomeException a -> Action (Seconds, a) actionAlwaysRequeuePriority pri res = 
Action $ do Global{..} <- getRO offset <- liftIO offsetTime captureRAW $ \continue -> addPool pri globalPool $ do offset <- offset continue $ (offset,) <$> res shake-0.19.8/src/Development/Shake/Internal/Core/Rules.hs0000644000000000000000000003650207346545000021301 0ustar0000000000000000{-# LANGUAGE RecordWildCards, ScopedTypeVariables #-} {-# LANGUAGE GeneralizedNewtypeDeriving, ConstraintKinds, NamedFieldPuns #-} {-# LANGUAGE ExistentialQuantification, RankNTypes #-} {-# LANGUAGE TypeFamilies, TypeOperators, DeriveDataTypeable #-} {-# LANGUAGE FlexibleInstances #-} module Development.Shake.Internal.Core.Rules( Rules, SRules(..), runRules, RuleResult, addBuiltinRule, addBuiltinRuleEx, noLint, noIdentity, getShakeOptionsRules, getUserRuleInternal, getUserRuleOne, getUserRuleList, getUserRuleMaybe, addUserRule, alternatives, priority, versioned, getTargets, addTarget, withTargetDocs, withoutTargets, addHelpSuffix, getHelpSuffix, action, withoutActions ) where import Control.Applicative import Data.Tuple.Extra import Control.Exception import Control.Monad.Extra import Control.Monad.Fix import Control.Monad.IO.Class import Control.Monad.Trans.Reader import Development.Shake.Classes import General.Binary import General.Extra import Data.Typeable import Data.Data import Data.List.Extra import qualified Data.HashMap.Strict as Map import qualified General.TypeMap as TMap import Data.Maybe import Data.IORef import Data.Semigroup import qualified Data.ByteString.Lazy as LBS import qualified Data.Binary.Builder as Bin import Data.Binary.Put import Data.Binary.Get import General.ListBuilder import Control.Monad.Fail import Prelude import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Monad import Development.Shake.Internal.Value import Development.Shake.Internal.Options import Development.Shake.Internal.Errors --------------------------------------------------------------------- -- RULES -- | Get the 'ShakeOptions' that were used. 
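-- For example (a sketch, assuming the standard 'action' and
-- 'Development.Shake.putInfo' helpers):
--
-- @
-- myRules :: 'Rules' ()
-- myRules = do
--     opts <- 'getShakeOptionsRules'
--     'action' $ 'Development.Shake.putInfo' $ \"shakeFiles = \" ++ 'shakeFiles' opts
-- @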
getShakeOptionsRules :: Rules ShakeOptions
getShakeOptionsRules = Rules $ asks fst

-- | Internal variant, more flexible, but not such a nice API
--   Same args as getUserRuleMaybe, but returns (guaranteed version, items, error to throw if wrong number)
--   Fields are returned lazily, in particular ver can be looked up cheaper
getUserRuleInternal :: forall key a b . (ShakeValue key, Typeable a) => key -> (a -> Maybe String) -> (a -> Maybe b) -> Action (Maybe Ver, [(Int, b)], SomeException)
getUserRuleInternal key disp test = do
    Global{..} <- Action getRO
    let UserRuleVersioned versioned rules = fromMaybe mempty $ TMap.lookup globalUserRules
    let ver = if versioned then Nothing else Just $ Ver 0
    let items = headDef [] $ map snd $ reverse $ groupSort $ f (Ver 0) Nothing rules
    let err = errorMultipleRulesMatch (typeOf key) (show key) (map snd3 items)
    pure (ver, map (\(Ver v,_,x) -> (v,x)) items, err)
    where
        f :: Ver -> Maybe Double -> UserRule a -> [(Double,(Ver,Maybe String,b))]
        f v p (UserRule x) = [(fromMaybe 1 p, (v,disp x,x2)) | Just x2 <- [test x]]
        f v p (Unordered xs) = concatMap (f v p) xs
        f v p (Priority p2 x) = f v (Just $ fromMaybe p2 p) x
        f _ p (Versioned v x) = f v p x
        f v p (Alternative x) = take 1 $ f v p x

-- | Get the user rules that were added at a particular type which return 'Just' on a given function.
--   Return all equally applicable rules, paired with the version of the rule
--   (set by 'versioned'). Where rules are specified with 'alternatives' or 'priority'
--   the less-applicable rules will not be returned.
--
--   If you can only deal with zero/one results, call 'getUserRuleMaybe' or 'getUserRuleOne',
--   which raise informative errors.
getUserRuleList :: Typeable a => (a -> Maybe b) -> Action [(Int, b)]
getUserRuleList test = snd3 <$> getUserRuleInternal () (const Nothing) test

-- | A version of 'getUserRuleList' that fails if there is more than one result
--   Requires a @key@ for better error messages.
getUserRuleMaybe :: (ShakeValue key, Typeable a) => key -> (a -> Maybe String) -> (a -> Maybe b) -> Action (Maybe (Int, b)) getUserRuleMaybe key disp test = do (_, xs, err) <- getUserRuleInternal key disp test case xs of [] -> pure Nothing [x] -> pure $ Just x _ -> throwM err -- | A version of 'getUserRuleList' that fails if there is not exactly one result -- Requires a @key@ for better error messages. getUserRuleOne :: (ShakeValue key, Typeable a) => key -> (a -> Maybe String) -> (a -> Maybe b) -> Action (Int, b) getUserRuleOne key disp test = do (_, xs, err) <- getUserRuleInternal key disp test case xs of [x] -> pure x _ -> throwM err -- | Define a set of rules. Rules can be created with calls to functions such as 'Development.Shake.%>' or 'action'. -- Rules are combined with either the 'Monoid' instance, or (more commonly) the 'Monad' instance and @do@ notation. -- To define your own custom types of rule, see "Development.Shake.Rule". newtype Rules a = Rules (ReaderT (ShakeOptions, IORef (SRules ListBuilder)) IO a) -- All IO must be associative/commutative (e.g. creating IORef/MVars) deriving (Functor, Applicative, Monad, MonadIO, MonadFix, Control.Monad.Fail.MonadFail) newRules :: SRules ListBuilder -> Rules () newRules x = Rules $ liftIO . flip modifyIORef' (<> x) =<< asks snd modifyRulesScoped :: (SRules ListBuilder -> SRules ListBuilder) -> Rules a -> Rules a modifyRulesScoped f (Rules r) = Rules $ do (opts, refOld) <- ask liftIO $ do refNew <- newIORef mempty res <- runReaderT r (opts, refNew) rules <- readIORef refNew modifyIORef' refOld (<> f rules) pure res runRules :: ShakeOptions -> Rules () -> IO (SRules []) runRules opts (Rules r) = do ref <- newIORef mempty{allowOverwrite = shakeAllowRedefineRules opts} runReaderT r (opts, ref) SRules{..} <- readIORef ref pure $ SRules (runListBuilder actions) builtinRules userRules (runListBuilder targets) (runListBuilder helpSuffix) allowOverwrite -- | Get all targets registered in the given rules. 
The names in -- 'Development.Shake.phony' and 'Development.Shake.~>' as well as the file patterns -- in 'Development.Shake.%>', 'Development.Shake.|%>' and 'Development.Shake.&%>' are -- registered as targets, plus any explicit calls to 'addTarget'. -- Returns the command, paired with the documentation (if any). getTargets :: ShakeOptions -> Rules () -> IO [(String, Maybe String)] getTargets opts rs = do SRules{targets} <- runRules opts rs pure [(target, documentation) | Target{..} <- targets] getHelpSuffix :: ShakeOptions -> Rules () -> IO [String] getHelpSuffix opts rs = do SRules{helpSuffix} <- runRules opts rs pure helpSuffix data Target = Target {target :: !String ,documentation :: !(Maybe String) } deriving (Eq,Ord,Show,Read,Data,Typeable) data SRules list = SRules {actions :: !(list (Stack, Action ())) ,builtinRules :: !(Map.HashMap TypeRep{-k-} BuiltinRule) ,userRules :: !(TMap.Map UserRuleVersioned) ,targets :: !(list Target) ,helpSuffix :: !(list String) ,allowOverwrite :: Bool } instance Semigroup (SRules ListBuilder) where (SRules x1 x2 x3 x4 x5 x6) <> (SRules y1 y2 y3 y4 y5 y6) = SRules (mappend x1 y1) (Map.unionWithKey f x2 y2) (TMap.unionWith (<>) x3 y3) (mappend x4 y4) (mappend x5 y5) canOverwrite where canOverwrite = x6 && y6 f k a b | canOverwrite = b | otherwise = throwImpure $ errorRuleDefinedMultipleTimes k [builtinLocation a, builtinLocation b] instance Monoid (SRules ListBuilder) where mempty = SRules mempty Map.empty TMap.empty mempty mempty True mappend = (<>) instance Semigroup a => Semigroup (Rules a) where (<>) = liftA2 (<>) instance (Semigroup a, Monoid a) => Monoid (Rules a) where mempty = pure mempty mappend = (<>) -- | Add a user rule. In general these should be specialised to the type expected by a builtin rule. -- The user rules can be retrieved by 'getUserRuleList'. 
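--
--   As an illustration (the @MyRule@ type and @myRule@ wrapper are hypothetical),
--   a custom rule type could be registered with:
--
-- @
-- data MyRule = MyRule 'Development.Shake.FilePattern' (FilePath -> 'Development.Shake.Action' ())
--
-- myRule :: 'Development.Shake.FilePattern' -> (FilePath -> 'Development.Shake.Action' ()) -> 'Rules' ()
-- myRule pat act = 'addUserRule' $ MyRule pat act
-- @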
addUserRule :: Typeable a => a -> Rules ()
addUserRule r = newRules mempty{userRules = TMap.singleton $ UserRuleVersioned False $ UserRule r}

-- | Register a target, as available when passing @--help@ or through 'getTargets'.
--   Called automatically by rules such as 'Development.Shake.phony' and
--   'Development.Shake.%>' - to avoid that use 'withoutTargets'.
--   To add documentation to a target use 'withTargetDocs'.
addTarget :: String -> Rules ()
addTarget t = newRules mempty{targets = newListBuilder $ Target t Nothing}

-- | For all 'addTarget' targets within the 'Rules' provide the specified documentation, if they
--   don't already have documentation.
withTargetDocs :: String -> Rules () -> Rules ()
withTargetDocs d = modifyRulesScoped $ \x -> x{targets = f <$> targets x}
    where f (Target a b) = Target a $ Just $ fromMaybe d b

-- | Remove all targets specified in a set of rules, typically because they are internal details.
--   Overrides 'addTarget'.
withoutTargets :: Rules a -> Rules a
withoutTargets = modifyRulesScoped $ \x -> x{targets=mempty}

-- | Adds some extra information at the end of @--help@.
addHelpSuffix :: String -> Rules ()
addHelpSuffix s = newRules mempty{helpSuffix = newListBuilder s}

-- | A suitable 'BuiltinLint' that always succeeds.
noLint :: BuiltinLint key value
noLint _ _ = pure Nothing

-- | A suitable 'BuiltinIdentity' that always fails with a runtime error, incompatible with 'shakeShare'.
--   Use this function if you don't care about 'shakeShare', or if your rule provides a dependency that can
--   never be cached (in which case you should also call 'Development.Shake.historyDisable').
noIdentity :: BuiltinIdentity key value
noIdentity _ _ = Nothing

-- | The type mapping between the @key@ of a rule and the resulting @value@.
--   See 'addBuiltinRule' and 'Development.Shake.Rule.apply'.
type family RuleResult key -- = value

-- | Before looking at this function, you should read the warnings at the top of this module.
--   This function is not often necessary in build systems.
--
--   Define a builtin rule, passing the functions to run in the right circumstances.
--   The @key@ and @value@ types will be what is used by 'Development.Shake.Rule.apply'.
--   As a start, you can use 'noLint' and 'noIdentity' as the first two functions,
--   but are required to supply a suitable 'BuiltinRun'.
--
--   Raises an error if any other rule exists at this type.
--
--   For a worked example of writing a rule see .
addBuiltinRule :: (RuleResult key ~ value, ShakeValue key, Typeable value, NFData value, Show value, Partial) => BuiltinLint key value -> BuiltinIdentity key value -> BuiltinRun key value -> Rules ()
addBuiltinRule = withFrozenCallStack $ addBuiltinRuleInternal $ BinaryOp
    (putEx . Bin.toLazyByteString . execPut . put)
    (runGet get . LBS.fromChunks . pure)

addBuiltinRuleEx :: (RuleResult key ~ value, ShakeValue key, BinaryEx key, Typeable value, NFData value, Show value, Partial) => BuiltinLint key value -> BuiltinIdentity key value -> BuiltinRun key value -> Rules ()
addBuiltinRuleEx = addBuiltinRuleInternal $ BinaryOp putEx getEx

-- | Unexported version of 'addBuiltinRule', which also lets you set the 'BinaryOp'.
addBuiltinRuleInternal :: (RuleResult key ~ value, ShakeValue key, Typeable value, NFData value, Show value, Partial) => BinaryOp key -> BuiltinLint key value -> BuiltinIdentity key value -> BuiltinRun key value -> Rules ()
addBuiltinRuleInternal binary lint check (run :: BuiltinRun key value) = do
    let k = Proxy :: Proxy key
    let lint_ k v = lint (fromKey k) (fromValue v)
    let check_ k v = check (fromKey k) (fromValue v)
    let run_ k v b = fmap newValue <$> run (fromKey k) v b
    let binary_ = BinaryOp (putOp binary . fromKey) (newKey . getOp binary)
    newRules mempty{builtinRules = Map.singleton (typeRep k) $ BuiltinRule lint_ check_ run_ binary_ (Ver 0) callStackTop}

-- | Change the priority of a given set of rules, where higher values take precedence.
-- All matching rules at a given priority must be disjoint, or an error is raised. -- All builtin Shake rules have priority between 0 and 1. -- Excessive use of 'priority' is discouraged. As an example: -- -- @ -- 'priority' 4 $ \"hello.*\" %> \\out -> 'writeFile'' out \"hello.*\" -- 'priority' 8 $ \"*.txt\" %> \\out -> 'writeFile'' out \"*.txt\" -- @ -- -- In this example @hello.txt@ will match the second rule, instead of raising an error about ambiguity. -- -- The 'priority' function obeys the invariants: -- -- @ -- 'priority' p1 ('priority' p2 r1) === 'priority' p1 r1 -- 'priority' p1 (r1 >> r2) === 'priority' p1 r1 >> 'priority' p1 r2 -- @ priority :: Double -> Rules a -> Rules a priority d = modifyRulesScoped $ \s -> s{userRules = TMap.map (\(UserRuleVersioned b x) -> UserRuleVersioned b $ Priority d x) $ userRules s} -- | Indicate that the nested rules have a given version. If you change the semantics of the rule then updating (or adding) -- a version will cause the rule to rebuild in some circumstances. -- -- @ -- 'versioned' 1 $ \"hello.*\" %> \\out -> -- 'writeFile'' out \"Writes v1 now\" -- previously wrote out v0 -- @ -- -- You should only use 'versioned' to track changes in the build source, for standard runtime dependencies you should use -- other mechanisms, e.g. 'Development.Shake.addOracle'. versioned :: Int -> Rules a -> Rules a versioned v = modifyRulesScoped $ \s -> s {userRules = TMap.map (\(UserRuleVersioned b x) -> UserRuleVersioned (b || v /= 0) $ Versioned (Ver v) x) $ userRules s ,builtinRules = Map.map (\b -> b{builtinVersion = Ver v}) $ builtinRules s } -- | Change the matching behaviour of rules so rules do not have to be disjoint, but are instead matched -- in order. Only recommended for small blocks containing a handful of rules. 
-- -- @ -- 'alternatives' $ do -- \"hello.*\" %> \\out -> 'writeFile'' out \"hello.*\" -- \"*.txt\" %> \\out -> 'writeFile'' out \"*.txt\" -- @ -- -- In this example @hello.txt@ will match the first rule, instead of raising an error about ambiguity. -- Inside 'alternatives' the 'priority' of each rule is not used to determine which rule matches, -- but the resulting match uses that priority compared to the rules outside the 'alternatives' block. alternatives :: Rules a -> Rules a alternatives = modifyRulesScoped $ \r -> r{userRules = TMap.map (\(UserRuleVersioned b x) -> UserRuleVersioned b $ Alternative x) $ userRules r} -- | Run an action, usually used for specifying top-level requirements. -- -- @ -- main = 'Development.Shake.shake' 'shakeOptions' $ do -- 'action' $ do -- b <- 'Development.Shake.doesFileExist' \"file.src\" -- when b $ 'Development.Shake.need' [\"file.out\"] -- @ -- -- This 'action' builds @file.out@, but only if @file.src@ exists. The 'action' -- will be run in every build execution (unless 'withoutActions' is used), so only cheap -- operations should be performed. On the flip side, consulting system information -- (e.g. environment variables) can be done directly as the information will not be cached. -- All calls to 'action' may be run in parallel, in any order. -- -- For the standard requirement of only 'Development.Shake.need'ing a fixed list of files in the 'action', -- see 'Development.Shake.want'. action :: Partial => Action a -> Rules () action act = newRules mempty{actions=newListBuilder (addCallStack callStackFull emptyStack, void act)} -- | Remove all actions specified in a set of rules, usually used for implementing -- command line specification of what to build. 
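--
--   A sketch of typical use (the @explicitTargets@ and @myRules@ names are
--   hypothetical, for illustration only):
--
-- @
-- 'Development.Shake.want' explicitTargets
-- 'withoutActions' myRules
-- @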
withoutActions :: Rules a -> Rules a withoutActions = modifyRulesScoped $ \x -> x{actions=mempty} shake-0.19.8/src/Development/Shake/Internal/Core/Run.hs0000644000000000000000000004123507346545000020752 0ustar0000000000000000{-# LANGUAGE LambdaCase #-} {-# LANGUAGE RecordWildCards, ScopedTypeVariables, PatternGuards #-} {-# LANGUAGE ConstraintKinds, TupleSections, ViewPatterns #-} {-# LANGUAGE TypeFamilies, NamedFieldPuns #-} module Development.Shake.Internal.Core.Run( RunState, open, reset, run, shakeRunAfter, liveFilesState, profileState, errorsState ) where import Control.Exception import Data.Tuple.Extra import Control.Concurrent.Extra hiding (withNumCapabilities) import Development.Shake.Internal.Core.Database import Control.Monad.IO.Class import General.Binary import Development.Shake.Classes import Development.Shake.Internal.Core.Storage import Development.Shake.Internal.Core.Build import Development.Shake.Internal.History.Shared import Development.Shake.Internal.History.Cloud import qualified General.TypeMap as TMap import Control.Monad.Extra import Data.Typeable import Numeric.Extra import Data.List.Extra import qualified Data.HashMap.Strict as Map import qualified Data.HashSet as Set import Data.Dynamic import Data.Maybe import Data.IORef.Extra import System.Directory import System.Time.Extra import qualified Data.ByteString as BS import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Action import Development.Shake.Internal.Core.Rules import General.Pool import Development.Shake.Internal.Progress import Development.Shake.Internal.Value import Development.Shake.Internal.Profile import Development.Shake.Internal.Options import Development.Shake.Internal.Errors import General.Timing import General.Thread import General.Extra import General.Cleanup import Data.Monoid import Prelude --------------------------------------------------------------------- -- MAKE data RunState = RunState {opts :: ShakeOptions ,builtinRules :: 
Map.HashMap TypeRep BuiltinRule
    ,userRules :: TMap.Map UserRuleVersioned
    ,database :: Database
    ,curdir :: FilePath
    ,shared :: Maybe Shared
    ,cloud :: Maybe Cloud
    ,actions :: [(Stack, Action ())]
    }

open :: Cleanup -> ShakeOptions -> Rules () -> IO RunState
open cleanup opts rs = withInit opts $ \opts@ShakeOptions{..} diagnostic _ -> do
    diagnostic $ pure "Starting run"
    SRules{actions, builtinRules, userRules} <- runRules opts rs
    diagnostic $ pure $ "Number of actions = " ++ show (length actions)
    diagnostic $ pure $ "Number of builtin rules = " ++ show (Map.size builtinRules) ++ " " ++ show (Map.keys builtinRules)
    diagnostic $ pure $ "Number of user rule types = " ++ show (TMap.size userRules)
    diagnostic $ pure $ "Number of user rules = " ++ show (sum (TMap.toList (userRuleSize . userRuleContents) userRules))
    checkShakeExtra shakeExtra
    curdir <- getCurrentDirectory
    database <- usingDatabase cleanup opts diagnostic builtinRules
    (shared, cloud) <- loadSharedCloud database opts builtinRules
    pure RunState{..}

-- Prepare for a fresh run by changing Result to Loaded
reset :: RunState -> IO ()
reset RunState{..} = runLocked database $ modifyAllMem database f
    where
        f (Ready r) = Loaded (snd <$> r)
        f (Failed _ x) = maybe Missing Loaded x
        f (Running _ x) = maybe Missing Loaded x -- shouldn't ever happen, but Loaded is least worst
        f x = x

run :: RunState -> Bool -> [Action ()] -> IO [IO ()]
run RunState{..} oneshot actions2 = withInit opts $ \opts@ShakeOptions{..} diagnostic output -> do
    -- timings are a bit delicate, we want to make sure we clear them before we leave (so each run is fresh)
    -- but we also want to only print them if there is no exception, and have to capture them before we clear them
    -- we use this variable to stash them away, then print after the exception handling block
    timingsToShow <- newIORef Nothing
    res <- withCleanup $ \cleanup -> do
        register cleanup $ do
            when (shakeTimings && shakeVerbosity >= Info) $
                writeIORef timingsToShow .
Just =<< getTimings resetTimings start <- offsetTime except <- newIORef (Nothing :: Maybe (String, ShakeException)) let getFailure = fmap fst <$> readIORef except let raiseError err | not shakeStaunch = throwIO err | otherwise = do let named = shakeAbbreviationsApply opts . shakeExceptionTarget atomicModifyIORef except $ \v -> (Just $ fromMaybe (named err, err) v, ()) -- no need to print exceptions here, they get printed when they are wrapped after <- newIORef [] absent <- newIORef [] step <- incrementStep database getProgress <- usingProgress cleanup opts database step getFailure lintCurrentDirectory curdir "When running" watch <- lintWatch shakeLintWatch let ruleFinished | isJust shakeLint = \k -> do liftIO $ lintCurrentDirectory curdir $ show k lintTrackFinished liftIO $ watch $ show k | otherwise = liftIO . watch . show addTiming "Running rules" locals <- newIORef [] runPool (shakeThreads == 1) shakeThreads $ \pool -> do let global = Global applyKeyValue database pool cleanup start builtinRules output opts diagnostic ruleFinished after absent getProgress userRules shared cloud step oneshot -- give each action a stack to start with! forM_ (actions ++ map (emptyStack,) actions2) $ \(stack, act) -> do let local = newLocal stack shakeVerbosity addPool PoolStart pool $ runAction global local (act >> getLocal) $ \case Left e -> raiseError =<< shakeException global stack e Right local -> atomicModifyIORef_ locals (local:) whenJustM (readIORef except) (throwIO . 
snd) assertFinishedDatabase database let putWhen lvl msg = when (shakeVerbosity >= lvl) $ output lvl msg locals <- readIORef locals end <- start if null actions && null actions2 then putWhen Info "Warning: No want/action statements, nothing to do" else recordRoot step locals end database when (isJust shakeLint) $ do addTiming "Lint checking" lintCurrentDirectory curdir "After completion" checkValid diagnostic database (runLint builtinRules) =<< readIORef absent putWhen Verbose "Lint checking succeeded" when (shakeReport /= []) $ do addTiming "Profile report" forM_ shakeReport $ \file -> do putWhen Info $ "Writing report to " ++ file writeProfile file database when (shakeLiveFiles /= []) $ do addTiming "Listing live" diagnostic $ pure "Listing live keys" xs <- liveFiles database forM_ shakeLiveFiles $ \file -> do putWhen Info $ "Writing live list to " ++ file (if file == "-" then putStr else writeFile file) $ unlines xs res <- readIORef after addTiming "Cleanup" pure res whenJustM (readIORef timingsToShow) $ putStr . unlines pure res -- | Run a set of IO actions, treated as \"after\" actions, typically returned from -- 'Development.Shake.Database.shakeRunDatabase'. The actions will be run with diagnostics -- etc as specified in the 'ShakeOptions'. 
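--
-- A sketch of typical use (assuming a database @db@ opened via the
-- "Development.Shake.Database" API, whose run function returns the results
-- paired with the after actions):
--
-- @
-- (_results, after) <- 'Development.Shake.Database.shakeRunDatabase' db acts
-- 'shakeRunAfter' opts after
-- @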
shakeRunAfter :: ShakeOptions -> [IO ()] -> IO () shakeRunAfter _ [] = pure () shakeRunAfter opts after = withInit opts $ \ShakeOptions{..} diagnostic _ -> do let n = show $ length after diagnostic $ pure $ "Running " ++ n ++ " after actions" (time, _) <- duration $ sequence_ $ reverse after when (shakeTimings && shakeVerbosity >= Info) $ putStrLn $ "(+ running " ++ show n ++ " after actions in " ++ showDuration time ++ ")" withInit :: ShakeOptions -> (ShakeOptions -> (IO String -> IO ()) -> (Verbosity -> String -> IO ()) -> IO a) -> IO a withInit opts act = withCleanup $ \cleanup -> do opts@ShakeOptions{..} <- usingShakeOptions cleanup opts (diagnostic, output) <- outputFunctions opts <$> newLock act opts diagnostic output usingShakeOptions :: Cleanup -> ShakeOptions -> IO ShakeOptions usingShakeOptions cleanup opts = do opts@ShakeOptions{..} <- if shakeThreads opts /= 0 then pure opts else do p <- getProcessorCount; pure opts{shakeThreads=p} when shakeLineBuffering $ usingLineBuffering cleanup usingNumCapabilities cleanup shakeThreads pure opts outputFunctions :: ShakeOptions -> Lock -> (IO String -> IO (), Verbosity -> String -> IO ()) outputFunctions opts@ShakeOptions{..} outputLock = (diagnostic, output) where outputLocked v msg = withLock outputLock $ shakeOutput v msg diagnostic | shakeVerbosity < Diagnostic = const $ pure () | otherwise = \act -> do v <- act; outputLocked Diagnostic $ "% " ++ v output v = outputLocked v . 
shakeAbbreviationsApply opts usingProgress :: Cleanup -> ShakeOptions -> Database -> Step -> IO (Maybe String) -> IO (IO Progress) usingProgress cleanup ShakeOptions{..} database step getFailure = do let getProgress = do failure <- getFailure stats <- progress database step pure stats{isFailure=failure} allocateThread cleanup $ shakeProgress getProgress pure getProgress checkShakeExtra :: Map.HashMap TypeRep Dynamic -> IO () checkShakeExtra mp = do let bad = [(k,t) | (k,v) <- Map.toList mp, let t = dynTypeRep v, t /= k] case bad of (k,t):xs -> throwIO $ errorStructured "Invalid Map in shakeExtra" [("Key",Just $ show k),("Value type",Just $ show t)] (if null xs then "" else "Plus " ++ show (length xs) ++ " other keys") _ -> pure () runLint :: Map.HashMap TypeRep BuiltinRule -> Key -> Value -> IO (Maybe String) runLint mp k v = case Map.lookup (typeKey k) mp of Nothing -> pure Nothing Just BuiltinRule{..} -> builtinLint k v assertFinishedDatabase :: Database -> IO () assertFinishedDatabase database = do -- if you have anyone Waiting, and are not exiting with an error, then must have a complex recursion (see #400) status <- getKeyValues database let bad = [key | (key, Running{}) <- status] when (bad /= []) $ throwM $ errorComplexRecursion (map show bad) liveFilesState :: RunState -> IO [FilePath] liveFilesState RunState{..} = liveFiles database profileState :: RunState -> FilePath -> IO () profileState RunState{..} file = writeProfile file database liveFiles :: Database -> IO [FilePath] liveFiles database = do status <- getKeyValues database let specialIsFileKey t = show (fst $ splitTyConApp t) == "FileQ" pure [show k | (k, Ready{}) <- status, specialIsFileKey $ typeKey k] errorsState :: RunState -> IO [(String, SomeException)] errorsState RunState{..} = do status <- getKeyValues database pure [(show k, e) | (k, Failed e _) <- status] checkValid :: (IO String -> IO ()) -> Database -> (Key -> Value -> IO (Maybe String)) -> [(Key, Key)] -> IO () checkValid diagnostic db 
check absent = do status <- getKeyValues db diagnostic $ pure "Starting validity/lint checking" -- TEST 1: Have values changed since being depended on -- Do not use a forM here as you use too much stack space bad <- (\f -> foldM f [] status) $ \seen v -> case v of (key, Ready Result{..}) -> do good <- check key $ fst result diagnostic $ pure $ "Checking if " ++ show key ++ " is " ++ show result ++ ", " ++ if isNothing good then "passed" else "FAILED" pure $ [(key, result, now) | Just now <- [good]] ++ seen _ -> pure seen unless (null bad) $ do let n = length bad throwM $ errorStructured ("Lint checking error - " ++ (if n == 1 then "value has" else show n ++ " values have") ++ " changed since being depended upon") (intercalate [("",Just "")] [ [("Key", Just $ show key),("Old", Just $ show result),("New", Just now)] | (key, result, now) <- bad]) "" -- TEST 2: Is anything from lintTrackWrite which promised not to exist actually been created exists <- getIdFromKey db bad <- pure [(parent,key) | (parent, key) <- Set.toList $ Set.fromList absent, isJust $ exists key] unless (null bad) $ do let n = length bad throwM $ errorStructured ("Lint checking error - " ++ (if n == 1 then "value" else show n ++ " values") ++ " did not have " ++ (if n == 1 then "its" else "their") ++ " creation tracked") (intercalate [("",Just "")] [ [("Rule", Just $ show parent), ("Created", Just $ show key)] | (parent,key) <- bad]) "" diagnostic $ pure "Validity/lint check passed" --------------------------------------------------------------------- -- STORAGE usingDatabase :: Cleanup -> ShakeOptions -> (IO String -> IO ()) -> Map.HashMap TypeRep BuiltinRule -> IO Database usingDatabase cleanup opts diagnostic owitness = do let step = (typeRep (Proxy :: Proxy StepKey), (Ver 0, BinaryOp (const mempty) (const stepKey))) let root = (typeRep (Proxy :: Proxy Root), (Ver 0, BinaryOp (const mempty) (const rootKey))) witness<- pure $ Map.fromList [ (QTypeRep t, (version, BinaryOp (putDatabase putOp) 
(getDatabase getOp))) | (t,(version, BinaryOp{..})) <- step : root : Map.toList (Map.map (\BuiltinRule{..} -> (builtinVersion, builtinKey)) owitness)] (status, journal) <- usingStorage cleanup opts diagnostic witness journal<- pure $ \i k v -> journal (QTypeRep $ typeKey k) i (k, v) createDatabase status journal Missing incrementStep :: Database -> IO Step incrementStep db = runLocked db $ do stepId <- mkId db stepKey v <- liftIO $ getKeyValueFromId db stepId step <- liftIO $ evaluate $ case v of Just (_, Loaded r) -> incStep $ fromStepResult r _ -> Step 1 let stepRes = toStepResult step setMem db stepId stepKey $ Ready stepRes liftIO $ setDisk db stepId stepKey $ Loaded $ fmap snd stepRes pure step toStepResult :: Step -> Result (Value, BS_Store) toStepResult i = Result (newValue i, runBuilder $ putEx i) i i [] 0 [] fromStepResult :: Result BS_Store -> Step fromStepResult = getEx . result recordRoot :: Step -> [Local] -> Seconds -> Database -> IO () recordRoot step locals (doubleToFloat -> end) db = runLocked db $ do rootId <- mkId db rootKey let local = localMergeMutable (newLocal emptyStack Info) locals let rootRes = Result {result = (newValue (), BS.empty) ,changed = step ,built = step ,depends = flattenDepends $ localDepends local ,execution = 0 ,traces = flattenTraces $ addTrace (localTraces local) $ Trace BS.empty end end} setMem db rootId rootKey $ Ready rootRes liftIO $ setDisk db rootId rootKey $ Loaded $ fmap snd rootRes loadSharedCloud :: DatabasePoly k v -> ShakeOptions -> Map.HashMap TypeRep BuiltinRule -> IO (Maybe Shared, Maybe Cloud) loadSharedCloud var opts owitness = do let mp = Map.fromList $ map (first $ show . QTypeRep) $ Map.toList owitness let wit = binaryOpMap $ \a -> maybe (error $ "loadSharedCloud, couldn't find map for " ++ show a) builtinKey $ Map.lookup a mp let wit2 = BinaryOp (\k -> putOp wit (show $ QTypeRep $ typeKey k, k)) (snd . 
getOp wit) let keyVers = [(k, builtinVersion v) | (k,v) <- Map.toList owitness] let ver = makeVer $ shakeVersion opts shared <- case shakeShare opts of Nothing -> pure Nothing Just x -> Just <$> newShared (shakeSymlink opts) wit2 ver x cloud <- case newCloud (runLocked var) (Map.map builtinKey owitness) ver keyVers $ shakeCloud opts of _ | null $ shakeCloud opts -> pure Nothing Nothing -> fail "shakeCloud set but Shake not compiled for cloud operation" Just res -> Just <$> res pure (shared, cloud) putDatabase :: (Key -> Builder) -> ((Key, Status) -> Builder) putDatabase putKey (key, Loaded (Result x1 x2 x3 x4 x5 x6)) = putExN (putKey key) <> putExN (putEx x1) <> putEx x2 <> putEx x3 <> putEx x5 <> putExN (putEx x4) <> putEx x6 putDatabase _ (_, x) = throwImpure $ errorInternal $ "putWith, Cannot write Status with constructor " ++ statusType x getDatabase :: (BS.ByteString -> Key) -> BS.ByteString -> (Key, Status) getDatabase getKey bs | (key, bs) <- getExN bs , (x1, bs) <- getExN bs , (x2, x3, x5, bs) <- binarySplit3 bs , (x4, x6) <- getExN bs = (getKey key, Loaded (Result x1 x2 x3 (getEx x4) x5 (getEx x6))) shake-0.19.8/src/Development/Shake/Internal/Core/Storage.hs0000644000000000000000000002564007346545000021614 0ustar0000000000000000{-# LANGUAGE ScopedTypeVariables, RecordWildCards, FlexibleInstances #-} {-# LANGUAGE BangPatterns, GeneralizedNewtypeDeriving #-} {- This module stores the meta-data so its very important its always accurate We can't rely on getting any exceptions or termination at the end, so we'd better write out a journal We store a series of records, and if they contain twice as many records as needed, we compact -} module Development.Shake.Internal.Core.Storage( usingStorage ) where import General.Chunks import General.Cleanup import General.Binary import General.Intern import Development.Shake.Internal.Options import Development.Shake.Internal.Errors import General.Timing import General.FileLock import qualified General.Ids as Ids import 
Control.Exception.Extra import Control.Monad.Extra import Data.Monoid import Data.Either.Extra import Data.Time import Data.Char import Data.Word import System.Info import Development.Shake.Classes import Numeric import General.Extra import Data.List.Extra import Data.Maybe import System.FilePath import qualified Data.ByteString.UTF8 as UTF8 import qualified Data.HashMap.Strict as Map import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString as BS8 import Prelude -- Increment every time the on-disk format/semantics change, -- @x@ is for the users version number databaseVersion :: String -> String -- THINGS I WANT TO DO ON THE NEXT CHANGE -- * Change filepaths to store a 1 byte prefix saying 8bit ASCII or UTF8 -- * Duration and Time should be stored as number of 1/10000th seconds Int32 databaseVersion x = "SHAKE-DATABASE-14-" ++ os ++ "-" ++ arch ++ "-" ++ s ++ "\r\n" where s = tailErr $ init $ show x -- call show, then take off the leading/trailing quotes -- ensures we do not get \r or \n in the user portion messageCorrupt :: FilePath -> SomeException -> IO [String] messageCorrupt dbfile err = do msg <- showException err pure $ ("Error when reading Shake database " ++ dbfile) : map (" "++) (lines msg) ++ ["All files will be rebuilt"] messageDatabaseVersionChange :: FilePath -> BS.ByteString -> BS.ByteString -> [String] messageDatabaseVersionChange dbfile old new = ["Shake database version changed (either shake library version, or shakeVersion):" ," File: " ++ dbfile ," Old version: " ++ disp (limit $ BS.unpack old) ," New version: " ++ disp (BS.unpack new) ,"All rules will be rebuilt"] where limit x = let (a,b) = splitAt 200 x in a ++ (if null b then "" else "...") disp = map (\x -> if isPrint x && isAscii x then x else '?') . 
takeWhile (`notElem` ("\r\n" :: String))

messageMissingTypes :: FilePath -> [String] -> [String]
messageMissingTypes dbfile types =
    ["Shake database rules have changed for the following types:"
    ," File: " ++ dbfile] ++
    [" Type: " ++ x | x <- types] ++
    ["All rules using these types will be rebuilt"]

-- | Storage of heterogeneous things. In the particular case of Shake,
--   k ~ QTypeRep, v ~ (Key, Status{Value}).
--
--   The storage starts with a witness table saying what can be contained.
--   If any entries in the witness table don't have a current Witness then a fake
--   error witness is manufactured. If the witness ever changes the entire DB is
--   rewritten.
usingStorage
    :: (Show k, Eq k, Hashable k, NFData k, Show v, NFData v)
    => Cleanup
    -> ShakeOptions -- ^ Storage options
    -> (IO String -> IO ()) -- ^ Logging function
    -> Map.HashMap k (Ver, BinaryOp v) -- ^ Witnesses
    -> IO (Ids.Ids v, k -> Id -> v -> IO ())
usingStorage _ ShakeOptions{..} diagnostic _ | shakeFiles == "/dev/null" = do
    diagnostic $ pure "Using in-memory database"
    ids <- Ids.empty
    pure (ids, \_ _ _ -> pure ())
usingStorage cleanup ShakeOptions{..} diagnostic witness = do
    let lockFile = shakeFiles </> ".shake.lock"
    diagnostic $ pure $ "Before usingLockFile on " ++ lockFile
    usingLockFile cleanup lockFile
    diagnostic $ pure "After usingLockFile"
    let dbfile = shakeFiles </> ".shake.database"
    createDirectoryRecursive shakeFiles

    -- complete a partially failed compress
    whenM (restoreChunksBackup dbfile) $ do
        unexpected "Backup file exists, restoring over the previous file\n"
        diagnostic $ pure "Backup file move to original"

    addTiming "Database read"
    h <- usingChunks cleanup dbfile shakeFlush
    let corrupt
            | not shakeStorageLog = resetChunksCorrupt Nothing h
            | otherwise = do
                let file = dbfile <.> "corrupt"
                resetChunksCorrupt (Just file) h
                unexpected $ "Backup of corrupted file stored at " ++ file ++ "\n"

    -- check the version information matches
    let ver = BS.pack $ databaseVersion shakeVersion
    oldVer <- readChunkMax h $
fromIntegral $ BS.length ver + 100000 let verEq = Right ver == oldVer when (not shakeVersionIgnore && not verEq && oldVer /= Left BS.empty) $ do outputErr $ messageDatabaseVersionChange dbfile (fromEither oldVer) ver corrupt (!witnessNew, !save) <- evaluate $ saveWitness witness witnessOld <- readChunk h ids <- case witnessOld of Left _ -> do resetChunksCorrupt Nothing h pure Nothing Right witnessOld -> handleBool (not . isAsyncException) (\err -> do outputErr =<< messageCorrupt dbfile err corrupt pure Nothing) $ do (!missing, !load) <- evaluate $ loadWitness witness witnessOld when (missing /= []) $ outputErr $ messageMissingTypes dbfile missing ids <- Ids.empty let raw bs = "[len " ++ show (BS.length bs) ++ "] " ++ concat [['0' | length c == 1] ++ c | x <- BS8.unpack bs, let c = showHex x ""] let go !i = do v <- readChunk h case v of Left e -> do let slop = fromIntegral $ BS.length e when (slop > 0) $ unexpected $ "Last " ++ show slop ++ " bytes do not form a whole record\n" diagnostic $ pure $ "Read " ++ show i ++ " chunks, plus " ++ show slop ++ " slop" pure i Right bs | (id, Just (k,v)) <- load bs -> do evaluate $ rnf k evaluate $ rnf v Ids.insert ids id (k,v) diagnostic $ do let pretty (Left x) = "FAILURE: " ++ show x pretty (Right x) = x x2 <- try_ $ evaluate $ let s = show v in rnf s `seq` s pure $ "Chunk " ++ show i ++ " " ++ raw bs ++ " " ++ show id ++ " = " ++ pretty x2 go $ i+1 Right bs -> do diagnostic $ pure $ "Chunk " ++ show i ++ " " ++ raw bs ++ " UNKNOWN WITNESS" go i countItems <- go 0 countDistinct <- Ids.sizeUpperBound ids diagnostic $ pure $ "Found at most " ++ show countDistinct ++ " distinct entries out of " ++ show countItems when (countItems > countDistinct*2 || not verEq || witnessOld /= witnessNew) $ do addTiming "Database compression" resetChunksCompact h $ \out -> do out $ putEx ver out $ putEx witnessNew Ids.forWithKeyM_ ids $ \i (k,v) -> out $ save k i v Just <$> Ids.forCopy ids snd ids <- case ids of Just ids -> pure ids Nothing -> 
do
                writeChunk h $ putEx ver
                writeChunk h $ putEx witnessNew
                Ids.empty
        addTiming "With database"
        out <- usingWriteChunks cleanup h
        pure (ids, \k i v -> out $ save k i v)
    where
        unexpected x = when shakeStorageLog $ do
            t <- getCurrentTime
            appendFile (shakeFiles </> ".shake.storage.log") $ "\n[" ++ show t ++ "]: " ++ trimEnd x ++ "\n"
        outputErr x = do
            when (shakeVerbosity >= Warn) $
                shakeOutput Warn $ unlines x
            unexpected $ unlines x


-- | A list of witnesses, saved
type Witnesses = BS.ByteString

-- | The version and key, serialised
newtype Witness = Witness BS.ByteString
    deriving (Eq, Hashable, Ord)

toWitness :: Show k => Ver -> k -> Witness
toWitness (Ver v) k = Witness $ UTF8.fromString (show k ++ (if v == 0 then "" else ", v" ++ show v))

instance BinaryEx [Witness] where
    putEx xs = putEx [x | Witness x <- xs]
    getEx = map Witness . getEx

-- | Given the current witness table, and the serialised one from last time, return
--   (witnesses that got removed, way to deserialise an entry into an Id, and (if the witness remains) the key and value)
loadWitness :: forall k v . Show k => Map.HashMap k (Ver, BinaryOp v) -> Witnesses -> ([String], BS.ByteString -> (Id, Maybe (k, v)))
loadWitness mp bs = (,) missing $ seq ind $ \bs ->
    let (wInd :: Word16, i :: Id, bs2) = binarySplit2 bs
    in case ind (fromIntegral wInd) of
        Nothing -> throwImpure $ errorInternal $ "Witness index out of bounds, " ++ show wInd
        Just f -> (i, f bs2)
    where
        ws :: [Witness] = getEx bs
        missing = [UTF8.toString w | (i, Witness w) <- zipFrom 0 ws, isNothing $ fromJust (ind i) BS.empty]
        mp2 :: Map.HashMap Witness (k, BinaryOp v) = Map.fromList [(toWitness ver k, (k, bin)) | (k,(ver,bin)) <- Map.toList mp]
        ind :: (Int -> Maybe (BS.ByteString -> Maybe (k, v))) = seq mp2 $ fastAt $ flip map ws $ \w ->
            case Map.lookup w mp2 of
                Nothing -> const Nothing
                Just (k, BinaryOp{..}) -> \bs -> Just (k, getOp bs)

saveWitness :: forall k v .
(Eq k, Hashable k, Show k) => Map.HashMap k (Ver, BinaryOp v) -> (Witnesses, k -> Id -> v -> Builder) saveWitness mp | Map.size mp > fromIntegral (maxBound :: Word16) = throwImpure $ errorInternal $ "Number of distinct witness types exceeds limit, got " ++ show (Map.size mp) | otherwise = (runBuilder $ putEx ws ,mpSave `seq` \k -> fromMaybe (throwImpure $ errorInternal $ "Don't know how to save, " ++ show k) $ Map.lookup k mpSave) where -- the entries in the witness table (in a stable order, to make it more likely to get a good equality) ws :: [Witness] = sort $ map (\(k,(ver,_)) -> toWitness ver k) $ Map.toList mp -- an index for each of the witness entries wsIndex :: Map.HashMap Witness Word16 = Map.fromList $ zip ws [0 :: Word16 ..] -- the save functions mpSave :: Map.HashMap k (Id -> v -> Builder) = flip Map.mapWithKey mp $ \k (ver,BinaryOp{..}) -> let tag = putEx $ wsIndex Map.! toWitness ver k in \(Id w) v -> tag <> putEx w <> putOp v shake-0.19.8/src/Development/Shake/Internal/Core/Types.hs0000644000000000000000000005235007346545000021312 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, ScopedTypeVariables, DeriveDataTypeable, ViewPatterns #-} {-# LANGUAGE ExistentialQuantification, DeriveFunctor, RecordWildCards, FlexibleInstances #-} module Development.Shake.Internal.Core.Types( BuiltinRun, BuiltinLint, BuiltinIdentity, RunMode(..), RunResult(..), RunChanged(..), UserRule(..), UserRuleVersioned(..), userRuleSize, BuiltinRule(..), Global(..), Local(..), Action(..), runAction, addDiscount, newLocal, localClearMutable, localMergeMutable, Traces, newTrace, addTrace, flattenTraces, DependsList, flattenDepends, enumerateDepends, addDepends, addDepends1, newDepends, Stack, Step(..), Result(..), Database, DatabasePoly(..), Depends(..), Status(..), Trace(..), BS_Store, getResult, exceptionStack, statusType, addStack, addCallStack, incStep, emptyStack, topStack, showTopStack, stepKey, StepKey(..), rootKey, Root(..) 
    ) where

import Control.Monad.IO.Class
import Control.DeepSeq
import Foreign.Storable
import Data.Word
import Data.Typeable
import General.Binary
import Data.Maybe
import Data.List
import Control.Exception
import General.Extra
import Development.Shake.Internal.Core.Database
import Development.Shake.Internal.History.Shared
import Development.Shake.Internal.History.Cloud
import Development.Shake.Internal.History.Types
import Development.Shake.Internal.Errors
import qualified General.TypeMap as TMap
import Data.IORef
import qualified Data.ByteString.Char8 as BS
import Numeric.Extra
import System.Time.Extra
import General.Intern(Id)
import qualified Data.HashSet as Set
import qualified Data.HashMap.Strict as Map
import Data.Tuple.Extra
import General.Pool
import Development.Shake.Internal.Core.Monad
import Development.Shake.Internal.Value
import Development.Shake.Internal.Options
import Development.Shake.Classes
import Data.Semigroup
import General.Cleanup
import Control.Monad.Fail
import Prelude


---------------------------------------------------------------------
-- UNDERLYING DATA TYPE

-- | The 'Action' monad, use 'liftIO' to raise 'IO' actions into it, and 'Development.Shake.need' to execute files.
--   Action values are used by 'addUserRule' and 'action'. The 'Action' monad tracks the dependencies of a rule.
--   To raise an exception call 'error', 'fail' or @'liftIO' . 'throwIO'@.
--
--   The 'Action' type is both a 'Monad' and 'Applicative'. Anything that is depended upon applicatively
--   will have its dependencies run in parallel. For example @'need' [\"a\"] *> 'need' [\"b\"]@ is equivalent
--   to @'need' [\"a\", \"b\"]@.
newtype Action a = Action {fromAction :: RAW ([String],[Key]) [Value] Global Local a}
    deriving (Functor, Applicative, Monad, MonadIO, Typeable, Semigroup, Monoid, MonadFail)

runAction :: Global -> Local -> Action a -> Capture (Either SomeException a)
runAction g l (Action x) = runRAW (fromAction .
build) g l x where -- first argument is a list of call stacks, since build only takes one we use the first -- they are very probably all identical... build :: [([String], [Key])] -> Action [[Value]] build [] = pure [] build ks@((callstack,_):_) = do let kss = map snd ks unconcat kss <$> globalBuild g callstack (concat kss) --------------------------------------------------------------------- -- PUBLIC TYPES -- | What mode a rule is running in, passed as an argument to 'BuiltinRun'. data RunMode = RunDependenciesSame -- ^ My dependencies have not changed. | RunDependenciesChanged -- ^ At least one of my dependencies from last time have changed, or I have no recorded dependencies. deriving (Eq,Show) instance NFData RunMode where rnf x = x `seq` () -- | How the output of a rule has changed. data RunChanged = ChangedNothing -- ^ Nothing has changed. | ChangedStore -- ^ The stored value has changed, but in a way that should be considered identical (used rarely). | ChangedRecomputeSame -- ^ I recomputed the value and it was the same. | ChangedRecomputeDiff -- ^ I recomputed the value and it was different. deriving (Eq,Show) instance NFData RunChanged where rnf x = x `seq` () -- | The result of 'BuiltinRun'. data RunResult value = RunResult {runChanged :: RunChanged -- ^ How has the 'RunResult' changed from what happened last time. ,runStore :: BS.ByteString -- ^ The value to store in the Shake database. ,runValue :: value -- ^ The value to return from 'Development.Shake.Rule.apply'. 
} deriving Functor instance NFData value => NFData (RunResult value) where rnf (RunResult x1 x2 x3) = rnf x1 `seq` x2 `seq` rnf x3 --------------------------------------------------------------------- -- UTILITY TYPES newtype Step = Step Word32 deriving (Eq,Ord,Show,Storable,BinaryEx,NFData,Hashable,Typeable) incStep (Step i) = Step $ i + 1 -- To simplify journaling etc we smuggle the Step in the database, with a special StepKey newtype StepKey = StepKey () deriving (Show,Eq,Typeable,Hashable,Binary,BinaryEx,NFData) stepKey :: Key stepKey = newKey $ StepKey () -- To make sure profiling has a complete view of what was demanded and all top-level 'action' -- things we fake up a Root node representing everything that was demanded newtype Root = Root () deriving (Eq,Typeable,Hashable,Binary,BinaryEx,NFData) instance Show Root where show (Root ()) = "Root" rootKey :: Key rootKey = newKey $ Root () --------------------------------------------------------------------- -- CALL STACK -- Invariant: Every key must have its Id in the set data Stack = Stack (Maybe Key) [Either Key [String]] !(Set.HashSet Id) deriving Show exceptionStack :: Stack -> SomeException -> ShakeException exceptionStack stack@(Stack _ xs1 _) (callStackFromException -> (xs2, e)) = ShakeException (showTopStack stack) (xs ++ ["* Raised the exception:" | not $ null xs]) e where xs = concatMap f $ reverse xs1 ++ [Right xs2] f (Left x) = ["* Depends on: " ++ show x] f (Right x) = map (" at " ++) x showTopStack :: Stack -> String showTopStack = maybe "" show . topStack addStack :: Id -> Key -> Stack -> Either SomeException Stack addStack i k (Stack _ ks is) | i `Set.member` is = Left $ toException $ exceptionStack stack2 $ errorRuleRecursion (typeKey k) (show k) | otherwise = Right stack2 where stack2 = Stack (Just k) (Left k:ks) (Set.insert i is) addCallStack :: [String] -> Stack -> Stack -- use group/head to squash adjacent duplicates, e.g. 
a want does an action and a need, both of which get the same location addCallStack xs (Stack t a b) = Stack t (Right xs : dropWhile (== Right xs) a) b topStack :: Stack -> Maybe Key topStack (Stack t _ _) = t emptyStack :: Stack emptyStack = Stack Nothing [] Set.empty --------------------------------------------------------------------- -- TRACE data Trace = Trace {traceMessage :: {-# UNPACK #-} !BS.ByteString ,traceStart :: {-# UNPACK #-} !Float ,traceEnd :: {-# UNPACK #-} !Float } deriving Show instance NFData Trace where rnf x = x `seq` () -- all strict atomic fields instance BinaryEx Trace where putEx (Trace a b c) = putEx b <> putEx c <> putEx a getEx x | (b,c,a) <- binarySplit2 x = Trace a b c instance BinaryEx [Trace] where putEx = putExList . map putEx getEx = map getEx . getExList newTrace :: String -> Seconds -> Seconds -> Trace newTrace msg start stop = Trace (BS.pack msg) (doubleToFloat start) (doubleToFloat stop) --------------------------------------------------------------------- -- CENTRAL TYPES -- Things stored under OneShot are not required if we only do one compilation, -- but are if we do multiple, as we have to reset the database each time. -- globalOneShot controls that, and gives us a small memory optimisation. 
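The `BinaryEx Trace` instance above deliberately writes the two fixed-width `Float` fields before the variable-length message, so decoding needs no stored length prefix: peel the fixed-size fields off the front and whatever remains is the message. A hedged, self-contained sketch of that layout trick using the `binary` package rather than Shake's internal `General.Binary` (`putTrace`/`getTrace` are invented names, not Shake's API):

```haskell
import Data.Binary.Put (runPut, putFloatle, putByteString)
import Data.Binary.Get (runGet, getFloatle, getRemainingLazyByteString)
import qualified Data.ByteString.Char8 as BS
import qualified Data.ByteString.Lazy as BL

-- Fixed-width fields first, variable-length payload last: the decoder peels
-- two 4-byte floats off the front and keeps the remainder as the message.
putTrace :: BS.ByteString -> Float -> Float -> BL.ByteString
putTrace msg start end = runPut $ do
    putFloatle start
    putFloatle end
    putByteString msg

getTrace :: BL.ByteString -> (BS.ByteString, Float, Float)
getTrace = runGet $ do
    start <- getFloatle
    end <- getFloatle
    rest <- getRemainingLazyByteString
    pure (BL.toStrict rest, start, end)
```

The same idea is why the real instance serialises `putEx b <> putEx c <> putEx a` rather than in declaration order.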
type OneShot a = a data Status = Ready !(Result (Value, OneShot BS_Store)) -- ^ I have a value | Failed !SomeException !(OneShot (Maybe (Result BS_Store))) -- ^ I have been run and raised an error | Loaded !(Result BS_Store) -- ^ Loaded from the database | Running !(NoShow (Either SomeException (Result (Value, BS_Store)) -> Locked ())) (Maybe (Result BS_Store)) -- ^ Currently in the process of being checked or built | Missing -- ^ I am only here because I got into the Intern table deriving Show instance NFData Status where rnf x = case x of Ready x -> rnf x Failed x y -> rnfException x `seq` rnf y Loaded x -> rnf x Running _ x -> rnf x -- Can't RNF a waiting, but also unnecessary Missing -> () where -- best we can do for an arbitrary exception rnfException = rnf . show data Result a = Result {result :: !a -- ^ the result associated with the Key ,built :: {-# UNPACK #-} !Step -- ^ when it was actually run ,changed :: {-# UNPACK #-} !Step -- ^ the step for deciding if it's valid ,depends :: ![Depends] -- ^ dependencies (don't run them early) ,execution :: {-# UNPACK #-} !Float -- ^ how long it took when it was last run (seconds) ,traces :: ![Trace] -- ^ a trace of the expensive operations (start/end in seconds since beginning of run) } deriving (Show,Functor) instance NFData a => NFData (Result a) where -- ignore unpacked fields rnf (Result a _ _ b _ c) = rnf a `seq` rnf b `seq` rnf c statusType Ready{} = "Ready" statusType Failed{} = "Failed" statusType Loaded{} = "Loaded" statusType Running{} = "Running" statusType Missing{} = "Missing" getResult :: Status -> Maybe (Result (Either BS_Store Value)) getResult (Ready r) = Just $ Right . 
fst <$> r getResult (Loaded r) = Just $ Left <$> r getResult (Running _ r) = fmap Left <$> r getResult _ = Nothing --------------------------------------------------------------------- -- OPERATIONS newtype Depends = Depends {fromDepends :: [Id]} deriving (NFData, Semigroup, Monoid) instance Show Depends where -- Appears in diagnostic output and the Depends ctor is just verbose show = show . fromDepends instance BinaryEx Depends where putEx (Depends xs) = putExStorableList xs getEx = Depends . getExStorableList instance BinaryEx [Depends] where putEx = putExList . map putEx getEx = map getEx . getExList data DependsList = DependsNone | DependsDirect [Depends] | DependsSequence DependsList DependsList | DependsSequence1 DependsList Depends | DependsParallel [DependsList] -- Create a new set of depends, from a list in the right order newDepends :: [Depends] -> DependsList newDepends = DependsDirect -- Add two sequences of dependencies in order addDepends :: DependsList -> DependsList -> DependsList addDepends = DependsSequence addDepends1 :: DependsList -> Depends -> DependsList addDepends1 = DependsSequence1 -- Two goals here, merge parallel lists so they retain as much leading parallelism as possible -- Afterwards each Id must occur at most once and there are no empty Depends flattenDepends :: DependsList -> [Depends] flattenDepends d = fMany Set.empty $ flat d [] where flat :: DependsList -> [Depends] -> [Depends] flat DependsNone rest = rest flat (DependsDirect xs) rest = xs ++ rest flat (DependsSequence xs ys) rest = flat xs $ flat ys rest flat (DependsSequence1 xs y) rest = flat xs $ y:rest -- for each element of xs, we want to pull off the things that must be done first -- and then the stuff that can be done later flat (DependsParallel xs) rest = map mconcat xss ++ rest where xss = transpose $ map (`flat` []) xs fMany _ [] = [] fMany seen (Depends d:ds) = [Depends d2 | d2 /= []] ++ fMany seen2 ds where (d2,seen2) = fOne seen d fOne seen [] = ([], seen) fOne 
seen (x:xs) | x `Set.member` seen = fOne seen xs fOne seen (x:xs) = first (x:) $ fOne (Set.insert x seen) xs -- List all the dependencies in whatever order you wish, used for linting enumerateDepends :: DependsList -> [Depends] enumerateDepends d = f d [] where f DependsNone rest = rest f (DependsDirect xs) rest = xs ++ rest f (DependsSequence xs ys) rest = f xs $ f ys rest f (DependsSequence1 xs y) rest = f xs (y:rest) f (DependsParallel []) rest = rest f (DependsParallel (x:xs)) rest = f x $ f (DependsParallel xs) rest -- | Define a rule between @key@ and @value@. As an example, a typical 'BuiltinRun' will look like: -- -- > run key oldStore mode = do -- > ... -- > pure $ RunResult change newStore newValue -- -- Where you have: -- -- * @key@, how to identify individual artifacts, e.g. with file names. -- -- * @oldStore@, the value stored in the database previously, e.g. the file modification time. -- -- * @mode@, either 'RunDependenciesSame' (none of your dependencies changed, you can probably not rebuild) or -- 'RunDependenciesChanged' (your dependencies changed, probably rebuild). -- -- * @change@, usually one of either 'ChangedNothing' (no work was required) or 'ChangedRecomputeDiff' -- (I reran the rule and it should be considered different). -- -- * @newStore@, the new value to store in the database, which will be passed in next time as @oldStore@. -- -- * @newValue@, the result that 'Development.Shake.Rule.apply' will return when asked for the given @key@. type BuiltinRun key value = key -> Maybe BS.ByteString -> RunMode -> Action (RunResult value) -- | The action performed by @--lint@ for a given @key@/@value@ pair. -- At the end of the build the lint action will be called for each @key@ that was built this run, -- passing the @value@ it produced. Return 'Nothing' to indicate the value has not changed and -- is acceptable, or 'Just' an error message to indicate failure. 
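The `BuiltinRun` contract documented above can be illustrated with a runnable sketch. The types below are simplified local stand-ins (not the real `Key`/`Value` machinery, and `runSketch` is an invented name): if the mode says dependencies are unchanged and a stored value exists, reuse it with `ChangedNothing`; otherwise recompute and report `ChangedRecomputeDiff` along with the new bytes to store.

```haskell
-- Local stand-ins mirroring the shapes of Shake's types, so this runs alone.
data RunMode = RunDependenciesSame | RunDependenciesChanged
data RunChanged = ChangedNothing | ChangedRecomputeDiff deriving (Eq, Show)
data RunResult v = RunResult RunChanged String v deriving (Eq, Show)

-- key -> previously stored bytes -> mode -> result, as in type BuiltinRun.
runSketch :: String -> Maybe String -> RunMode -> IO (RunResult Int)
runSketch _key (Just old) RunDependenciesSame =
    pure $ RunResult ChangedNothing old (read old)       -- reuse the stored value
runSketch key _old _mode = do
    let new = length key                                 -- stand-in for real rule work
    pure $ RunResult ChangedRecomputeDiff (show new) new -- store and return the new value
```

The real thing differs in the details (the store is a `ByteString`, failures and equality checks matter), but the control flow is the same.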
--
--   For builtin rules where the value is expected to change, or has no useful checks to perform,
--   use 'Development.Shake.Rules.noLint'.
type BuiltinLint key value = key -> value -> IO (Maybe String)

-- | Produce an identity for a @value@ that can be used to do direct equality. If you have a custom
--   notion of equality then the result should return only one member from each equivalence class,
--   as values will be compared for literal equality.
--   The result of the identity should be reasonably short (if it is excessively long, hash it).
--
--   For rules where the value is never compatible use 'Development.Shake.Rules.noIdentity', which
--   returns 'Nothing'. This will disable shared caches of anything that depends on it.
type BuiltinIdentity key value = key -> value -> Maybe BS.ByteString

data BuiltinRule = BuiltinRule
    {builtinLint :: BuiltinLint Key Value
    ,builtinIdentity :: BuiltinIdentity Key Value
    ,builtinRun :: BuiltinRun Key Value
    ,builtinKey :: BinaryOp Key
    ,builtinVersion :: Ver
    ,builtinLocation :: String
    }

-- | A 'UserRule' data type, representing user-defined rules associated with a particular type.
--   As an example 'Development.Shake.?>' and 'Development.Shake.%>' will add entries to the 'UserRule' data type.
data UserRule a
-- > priority p1 (priority p2 x) == priority p1 x
-- > priority p (x `ordered` y) = priority p x `ordered` priority p y
-- > priority p (x `unordered` y) = priority p x `unordered` priority p y
-- > ordered is associative
-- > unordered is associative and commutative
-- > alternative does not obey priorities, until picking the best one
    = UserRule a -- ^ Added to the state with @'addUserRule' :: Typeable a => a -> 'Rules' ()@.
    | Unordered [UserRule a] -- ^ Rules combined with the 'Monad' \/ 'Monoid'.
    | Priority Double (UserRule a) -- ^ Rules defined under 'priority'.
    | Alternative (UserRule a) -- ^ Rule defined under 'alternatives', matched in order.
| Versioned Ver (UserRule a) -- ^ Rule defined under 'versioned', attaches a version. deriving (Eq,Show,Functor,Typeable) data UserRuleVersioned a = UserRuleVersioned {userRuleVersioned :: Bool -- ^ Does Versioned exist anywhere within userRuleContents ,userRuleContents :: UserRule a -- ^ The actual rules } instance Semigroup (UserRuleVersioned a) where UserRuleVersioned b1 x1 <> UserRuleVersioned b2 x2 = UserRuleVersioned (b1 || b2) (x1 <> x2) instance Monoid (UserRuleVersioned a) where mempty = UserRuleVersioned False mempty mappend = (<>) instance Semigroup (UserRule a) where x <> y = Unordered [x,y] instance Monoid (UserRule a) where mempty = Unordered [] mappend = (<>) userRuleSize :: UserRule a -> Int userRuleSize UserRule{} = 1 userRuleSize (Unordered xs) = sum $ map userRuleSize xs userRuleSize (Priority _ x) = userRuleSize x userRuleSize (Alternative x) = userRuleSize x userRuleSize (Versioned _ x) = userRuleSize x type Database = DatabasePoly Key Status -- global constants of Action data Global = Global {globalBuild :: [String] -> [Key] -> Action [Value] ,globalDatabase :: Database -- ^ Database, contains knowledge of the state of each key ,globalPool :: Pool -- ^ Pool, for queuing new elements ,globalCleanup :: Cleanup -- ^ Cleanup operations ,globalTimestamp :: IO Seconds -- ^ Clock saying how many seconds through the build ,globalRules :: Map.HashMap TypeRep BuiltinRule -- ^ Rules for this build ,globalOutput :: Verbosity -> String -> IO () -- ^ Output function ,globalOptions :: ShakeOptions -- ^ Shake options ,globalDiagnostic :: IO String -> IO () -- ^ Debugging function ,globalRuleFinished :: Key -> Action () -- ^ actions to run after each rule ,globalAfter :: IORef [IO ()] -- ^ Operations to run on success, e.g. 
removeFilesAfter
    ,globalTrackAbsent :: IORef [(Key, Key)] -- ^ Tracked things, in rule fst, snd must be absent
    ,globalProgress :: IO Progress -- ^ Request current progress state
    ,globalUserRules :: TMap.Map UserRuleVersioned
    ,globalShared :: Maybe Shared -- ^ The active shared state, if any
    ,globalCloud :: Maybe Cloud
    ,globalStep :: {-# UNPACK #-} !Step
    ,globalOneShot :: Bool -- ^ I am running in one-shot mode so don't need to store BS's for Result/Failed
    }

-- local variables of Action
data Local = Local
    -- constants
    {localStack :: Stack -- ^ The stack that ran to get here.
    ,localBuiltinVersion :: Ver -- ^ The builtinVersion of the rule you are running
    -- stack scoped local variables
    ,localVerbosity :: Verbosity -- ^ Verbosity, may be changed locally
    ,localBlockApply :: Maybe String -- ^ Reason to block apply, or Nothing to allow
    -- mutable local variables
    ,localDepends :: DependsList -- ^ Dependencies that we rely on, morally a list of sets
    ,localDiscount :: !Seconds -- ^ Time spent building dependencies (may be negative for parallel)
    ,localTraces :: Traces -- ^ Traces that have occurred
    ,localTrackAllows :: [Key -> Bool] -- ^ Things that are allowed to be used
    ,localTrackRead :: [Key] -- ^ Calls to 'lintTrackRead'
    ,localTrackWrite :: [Key] -- ^ Calls to 'lintTrackWrite'
    ,localProduces :: [(Bool, FilePath)] -- ^ Things this rule produces, True to check them
    ,localHistory :: !Bool -- ^ Is it valid to cache the result
    }

data Traces
    = TracesNone -- no traces
    | TracesSequence1 Traces Trace -- Like TracesSequence but with 1 element
    | TracesSequence Traces Traces -- first the Traces happened, then Traces that happened after
    | TracesParallel [Traces] -- these traces happened in parallel with each other

flattenTraces :: Traces -> [Trace]
flattenTraces t = f t []
    where
        f TracesNone rest = rest
        f (TracesSequence1 a b) rest = f a (b:rest)
        f (TracesSequence a b) rest = f a $ f b rest
        f (TracesParallel []) rest = rest
        -- Might want to resort them by time started?
f (TracesParallel (x:xs)) rest = f x $ f (TracesParallel xs) rest addTrace :: Traces -> Trace -> Traces addTrace ts t = ts `TracesSequence1` t addDiscount :: Seconds -> Local -> Local addDiscount s l = l{localDiscount = s + localDiscount l} newLocal :: Stack -> Verbosity -> Local newLocal stack verb = Local stack (Ver 0) verb Nothing DependsNone 0 TracesNone [] [] [] [] True -- Clear all the local mutable variables localClearMutable :: Local -> Local localClearMutable Local{..} = (newLocal localStack localVerbosity){localBlockApply=localBlockApply, localBuiltinVersion=localBuiltinVersion} -- Merge, works well assuming you clear the variables first with localClearMutable. -- Assume the first was run sequentially, and the list in parallel. localMergeMutable :: Local -> [Local] -> Local -- don't construct with RecordWildCards so any new fields raise an error localMergeMutable root xs = Local -- immutable/stack that need copying {localStack = localStack root ,localBuiltinVersion = localBuiltinVersion root ,localVerbosity = localVerbosity root ,localBlockApply = localBlockApply root -- mutable locals that need integrating -- note that a lot of the lists are stored in reverse, assume root happened first ,localDepends = DependsParallel (map localDepends xs) `DependsSequence` localDepends root ,localDiscount = sum $ map localDiscount $ root : xs ,localTraces = TracesParallel (map localTraces xs) `TracesSequence` localTraces root ,localTrackAllows = localTrackAllows root ++ concatMap localTrackAllows xs ,localTrackRead = localTrackRead root ++ concatMap localTrackRead xs ,localTrackWrite = localTrackWrite root ++ concatMap localTrackWrite xs ,localProduces = concatMap localProduces xs ++ localProduces root ,localHistory = all localHistory $ root:xs } shake-0.19.8/src/Development/Shake/Internal/Demo.hs0000644000000000000000000001003607346545000020175 0ustar0000000000000000 -- | Demo tutorial, accessed with --demo module Development.Shake.Internal.Demo(demo) where import 
Development.Shake.Internal.Paths
import Development.Shake.Command
import Control.Exception.Extra
import Control.Monad
import Data.List.Extra
import Data.Maybe
import System.Directory
import System.Exit
import System.FilePath
import General.Extra
import Development.Shake.FilePath(exe)
import System.IO
import System.Info.Extra


demo :: Bool -> IO ()
demo auto = do
    hSetBuffering stdout NoBuffering
    putStrLn $ "% Welcome to the Shake v" ++ shakeVersionString ++ " demo mode!"
    putStr "% Detecting machine configuration... "
    hasManual <- hasManualData
    ghc <- isJust <$> findExecutable "ghc"
    (gcc, gccPath) <- findGcc
    shakeLib <- wrap $ fmap (not . null . words . fromStdout) (cmd ("ghc-pkg list --simple-output shake" :: String))
    ninja <- findExecutable "ninja"
    putStrLn "done\n"

    let path = if isWindows then "%PATH%" else "$PATH"
    require ghc $ "% You don't have 'ghc' on your " ++ path ++ ", which is required to run the demo."
    require gcc $ "% You don't have 'gcc' on your " ++ path ++ ", which is required to run the demo."
    require shakeLib "% You don't have the 'shake' library installed with GHC, which is required to run the demo."
    require hasManual "% You don't have the Shake data files installed, which are required to run the demo."

    empty <- all (all (== '.')) <$> getDirectoryContents "."
    dir <- if empty then getCurrentDirectory else do
        home <- getHomeDirectory
        dir <- getDirectoryContents home
        pure $ home </> headErr (map ("shake-demo" ++) ("":map show [2..]) \\ dir)

    putStrLn "% The Shake demo uses an empty directory, OK to use:"
    putStrLn $ "% " ++ dir
    b <- yesNo auto
    require b "% Please create an empty directory to run the demo from, then run 'shake --demo' again."

    putStr "% Copying files... "
    copyManualData dir
    unless isWindows $ do
        p <- getPermissions $ dir </> "build.sh"
        setPermissions (dir </> "build.sh") p{executable=True}
    putStrLn "done"

    let pause = do
            putStr "% Press ENTER to continue: "
            if auto then putLine "" else getLine
    let execute x = do
            putStrLn $ "% RUNNING: " ++ x
            cmd (Cwd dir) (AddPath [] (maybeToList gccPath)) Shell x :: IO ()
    let build = if isWindows then "build" else "./build.sh"

    putStrLn "\n% [1/5] Building an example project with Shake."
    pause
    putStrLn $ "% RUNNING: cd " ++ dir
    execute build

    putStrLn "\n% [2/5] Running the produced example."
    pause
    execute $ "_build" </> "run" <.> exe

    putStrLn "\n% [3/5] Rebuilding an example project with Shake (nothing should change)."
    pause
    execute build

    putStrLn "\n% [4/5] Cleaning the build."
    pause
    execute $ build ++ " clean"

    putStrLn "\n% [5/5] Rebuilding with 2 threads and profiling."
    pause
    execute $ build ++ " -j2 --report --report=-"
    putStrLn "\n% See the profiling summary above, or look at the HTML profile report in"
    putStrLn $ "% " ++ dir </> "report.html"

    putStrLn "\n% Demo complete - all the examples can be run from:"
    putStrLn $ "% " ++ dir
    putStrLn "% For more info see https://shakebuild.com"
    when (isJust ninja) $ do
        putStrLn "\n% PS. Shake can also execute Ninja build files"
        putStrLn "% For more info see https://shakebuild.com/ninja"


-- | Require the user to press @y@ before continuing.
yesNo :: Bool -> IO Bool
yesNo auto = do
    putStr "% [Y/N] (then ENTER): "
    x <- if auto then putLine "y" else lower <$> getLine
    if "y" `isPrefixOf` x then
        pure True
     else if "n" `isPrefixOf` x then
        pure False
     else
        yesNo auto

putLine :: String -> IO String
putLine x = putStrLn x >> pure x

-- | Replace exceptions with 'False'.
wrap :: IO Bool -> IO Bool
wrap act = act `catch_` const (pure False)

-- | Require a condition to be true, or exit with a message.
require :: Bool -> String -> IO () require b msg = unless b $ putStrLn msg >> exitFailure shake-0.19.8/src/Development/Shake/Internal/Derived.hs0000644000000000000000000003226407346545000020702 0ustar0000000000000000{-# LANGUAGE ScopedTypeVariables #-} module Development.Shake.Internal.Derived( copyFile', copyFileChanged, readFile', readFileLines, writeFile', writeFileLines, writeFileChanged, withTempFile, withTempDir, withTempFileWithin, withTempDirWithin, getHashedShakeVersion, getShakeExtra, getShakeExtraRules, addShakeExtra, par, forP, newResource, newThrottle, withResources, newCache ) where import Control.Monad.Extra import Control.Monad.IO.Class import System.Directory import System.FilePath (takeDirectory) import System.IO (IOMode (..), hGetContents, withFile) import qualified System.IO.Extra as IO import Development.Shake.Internal.Errors import Development.Shake.Internal.Resource import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Action import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Options import Development.Shake.Internal.Rules.File import qualified Data.ByteString as BS import qualified Data.HashMap.Strict as Map import General.Extra import Data.List.Extra import Data.Hashable import Data.Typeable import Data.Dynamic -- | Get a checksum of a list of files, suitable for using as `shakeVersion`. -- This will trigger a rebuild when the Shake rules defined in any of the files are changed. -- For example: -- -- @ -- main = do -- ver <- 'getHashedShakeVersion' [\"Shakefile.hs\"] -- 'shakeArgs' 'shakeOptions'{'shakeVersion' = ver} ... -- @ -- -- To automatically detect the name of the current file, turn on the @TemplateHaskell@ -- extension and write @$(LitE . StringL . loc_filename \<$\> location)@. -- -- This feature can be turned off during development by passing -- the flag @--no-rule-version@ or setting 'shakeVersionIgnore' to 'True'. 
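The `shakeExtra` helpers below (`getShakeExtra`/`addShakeExtra`) key a `Dynamic` store by the value's own `TypeRep`, so each type has at most one slot. A self-contained sketch of that technique using `Data.Map` instead of Shake's `HashMap` (`insertExtra`/`lookupExtra` are invented names, not Shake's API):

```haskell
{-# LANGUAGE ScopedTypeVariables #-}
import Data.Dynamic (Dynamic, fromDynamic, toDyn)
import Data.Typeable (Typeable, TypeRep, Proxy(..), typeOf, typeRep)
import qualified Data.Map as Map

-- Insert a value under its own TypeRep; one slot per type.
insertExtra :: Typeable a => a -> Map.Map TypeRep Dynamic -> Map.Map TypeRep Dynamic
insertExtra x = Map.insert (typeOf x) (toDyn x)

-- Look up by the *requested* result type; fromDynamic guards the cast.
lookupExtra :: forall a . Typeable a => Map.Map TypeRep Dynamic -> Maybe a
lookupExtra mp = fromDynamic =<< Map.lookup (typeRep (Proxy :: Proxy a)) mp
```

The real `lookupShakeExtra` below additionally throws a structured error if the stored `Dynamic` does not match its key, which `fromDynamic` here silently maps to `Nothing`.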
getHashedShakeVersion :: [FilePath] -> IO String
getHashedShakeVersion files = do
    hashes <- mapM (fmap (hashWithSalt 0) . BS.readFile) files
    pure $ "hash-" ++ show (hashWithSalt 0 hashes)


-- | Get an item from 'shakeExtra', using the requested type as the key. Fails
--   if the value found at this key does not match the requested type.
getShakeExtra :: Typeable a => Action (Maybe a)
getShakeExtra = liftIO . lookupShakeExtra . shakeExtra =<< getShakeOptions

-- | A version of 'getShakeExtra' in 'Rules'.
getShakeExtraRules :: Typeable a => Rules (Maybe a)
getShakeExtraRules = liftIO . lookupShakeExtra . shakeExtra =<< getShakeOptionsRules

lookupShakeExtra :: forall a . Typeable a => Map.HashMap TypeRep Dynamic -> IO (Maybe a)
lookupShakeExtra mp =
    case Map.lookup want mp of
        Just dyn
            | Just x <- fromDynamic dyn -> pure $ Just x
            | otherwise -> throwM $ errorStructured
                "shakeExtra value is malformed, all keys and values must agree"
                [("Key", Just $ show want)
                ,("Value", Just $ show $ dynTypeRep dyn)]
                "Use addShakeExtra to ensure shakeExtra is well-formed"
        Nothing -> pure Nothing
    where want = typeRep (Proxy :: Proxy a)

-- | Add a properly structured value to 'shakeExtra' which can be retrieved with 'getShakeExtra'.
addShakeExtra :: Typeable a => a -> Map.HashMap TypeRep Dynamic -> Map.HashMap TypeRep Dynamic
addShakeExtra x = Map.insert (typeOf x) (toDyn x)


-- | @copyFile' old new@ copies the existing file from @old@ to @new@.
--   The @old@ file will be tracked as a dependency.
--   Also creates the new directory if necessary.
copyFile' :: Partial => FilePath -> FilePath -> Action ()
copyFile' old new = do
    need [old]
    putVerbose $ "Copying from " ++ old ++ " to " ++ new
    liftIO $ do
        createDirectoryRecursive $ takeDirectory new
        removeFile_ new -- symlink safety
        copyFile old new

-- | @copyFileChanged old new@ copies the existing file from @old@ to @new@, if the contents have changed.
--   The @old@ file will be tracked as a dependency.
--   Also creates the new directory if necessary.
copyFileChanged :: Partial => FilePath -> FilePath -> Action () copyFileChanged old new = do need [old] -- in newer versions of the directory package we can use copyFileWithMetadata which (we think) updates -- the timestamp as well and thus no need to read the source file twice. unlessM (liftIO $ doesFileExist new &&^ IO.fileEq old new) $ do putVerbose $ "Copying from " ++ old ++ " to " ++ new liftIO $ do createDirectoryRecursive $ takeDirectory new -- copyFile does a lot of clever stuff with permissions etc, so make sure we just reuse it removeFile_ new -- symlink safety liftIO $ copyFile old new -- | Read a file, after calling 'need'. The argument file will be tracked as a dependency. readFile' :: Partial => FilePath -> Action String readFile' x = need [x] >> liftIO (readFile x) -- | Write a file, lifted to the 'Action' monad. writeFile' :: (MonadIO m, Partial) => FilePath -> String -> m () writeFile' name x = liftIO $ do createDirectoryRecursive $ takeDirectory name removeFile_ name -- symlink safety writeFile name x -- | A version of 'readFile'' which also splits the result into lines. -- The argument file will be tracked as a dependency. readFileLines :: Partial => FilePath -> Action [String] readFileLines = fmap lines . readFile' -- | A version of 'writeFile'' which writes out a list of lines. writeFileLines :: (MonadIO m, Partial) => FilePath -> [String] -> m () writeFileLines name = writeFile' name . unlines -- | Write a file, but only if the contents would change. writeFileChanged :: (MonadIO m, Partial) => FilePath -> String -> m () writeFileChanged name x = liftIO $ do createDirectoryRecursive $ takeDirectory name b <- doesFileExist name if not b then writeFile name x else do -- Cannot use ByteString here, since it has different line handling -- semantics on Windows b <- withFile name ReadMode $ \h -> do src <- hGetContents h pure $! 
src /= x when b $ do removeFile_ name -- symlink safety writeFile name x -- | Create a temporary file in the temporary directory. The file will be deleted -- after the action completes (provided the file is not still open). -- The 'FilePath' will not have any file extension, will exist, and will be zero bytes long. -- If you require a file with a specific name, use 'withTempDir'. withTempFile :: (FilePath -> Action a) -> Action a withTempFile act = do (file, del) <- liftIO IO.newTempFile act file `actionFinally` del -- | Like 'withTempFile' but using a custom temporary directory. withTempFileWithin :: FilePath -> (FilePath -> Action a) -> Action a withTempFileWithin tdir act = do (file, del) <- liftIO $ IO.newTempFileWithin tdir act file `actionFinally` del -- | Create a temporary directory inside the system temporary directory. -- The directory will be deleted after the action completes. As an example: -- -- @ -- 'withTempDir' $ \\mydir -> do -- 'putInfo' $ \"Temp directory is \" ++ mydir -- 'writeFile'' (mydir \ \"test.txt\") \"writing out a temp file\" -- @ withTempDir :: (FilePath -> Action a) -> Action a withTempDir act = do (dir,del) <- liftIO IO.newTempDir act dir `actionFinally` del -- | Like 'withTempDir' but using a custom temporary directory. withTempDirWithin :: FilePath -> (FilePath -> Action a) -> Action a withTempDirWithin tdir act = do (dir,del) <- liftIO $ IO.newTempDirWithin tdir act dir `actionFinally` del -- | A 'parallel' version of 'forM'. forP :: [a] -> (a -> Action b) -> Action [b] forP xs f = parallel $ map f xs -- | Execute two operations in parallel, based on 'parallel'. par :: Action a -> Action b -> Action (a,b) par a b = (\[Left a, Right b] -> (a,b)) <$> parallel [Left <$> a, Right <$> b] -- | Create a finite resource, given a name (for error messages) and a quantity of the resource that exists. -- Shake will ensure that actions using the same finite resource do not execute in parallel. 
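-- (Illustrative sketch, not part of the original source.) The guarantee that actions
-- using the same finite resource never run in parallel can be pictured as a counting
-- semaphore: an 'MVar' holding the free quantity, where acquiring blocks until enough
-- units are available. Shake's real 'Resource' (in Development.Shake.Internal.Resource)
-- is more sophisticated; the names below are hypothetical and the acquire loop here
-- busy-waits purely for brevity.

```haskell
import Control.Concurrent.MVar
import Control.Exception (bracket_)

newtype ResourceSketch = ResourceSketch (MVar Int)

newResourceSketch :: Int -> IO ResourceSketch
newResourceSketch n = ResourceSketch <$> newMVar n

-- Acquire i units, run the action, then release, even on exceptions.
withResourceSketch :: ResourceSketch -> Int -> IO a -> IO a
withResourceSketch (ResourceSketch var) i act = bracket_ acquire release act
    where
        acquire = do
            ok <- modifyMVar var $ \n ->
                pure $ if n >= i then (n - i, True) else (n, False)
            if ok then pure () else acquire -- busy-wait; real code would block
        release = modifyMVar_ var $ \n -> pure (n + i)
```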
--   As an example, only one set of calls to the Excel API can occur at one time, therefore
--   Excel is a finite resource of quantity 1. You can write:
--
-- @
-- 'Development.Shake.shake' 'Development.Shake.shakeOptions'{'Development.Shake.shakeThreads'=2} $ do
--     'Development.Shake.want' [\"a.xls\",\"b.xls\"]
--     excel <- 'Development.Shake.newResource' \"Excel\" 1
--     \"*.xls\" 'Development.Shake.%>' \\out ->
--         'Development.Shake.withResource' excel 1 $
--             'Development.Shake.cmd' \"excel\" out ...
-- @
--
--   Now the two calls to @excel@ will not happen in parallel.
--
--   As another example, calls to compilers are usually CPU bound but calls to linkers are usually
--   disk bound. Running 8 linkers will often cause an 8 CPU system to grind to a halt. We can limit
--   ourselves to 4 linkers with:
--
-- @
-- disk <- 'Development.Shake.newResource' \"Disk\" 4
-- 'Development.Shake.want' [show i 'Development.Shake.FilePath.<.>' \"exe\" | i <- [1..100]]
-- \"*.exe\" 'Development.Shake.%>' \\out ->
--     'Development.Shake.withResource' disk 1 $
--         'Development.Shake.cmd' \"ld -o\" [out] ...
-- \"*.o\" 'Development.Shake.%>' \\out ->
--     'Development.Shake.cmd' \"cl -o\" [out] ...
-- @
newResource :: String -> Int -> Rules Resource
newResource name mx = liftIO $ newResourceIO name mx

-- | Create a throttled resource, given a name (for error messages) and a number of resources (the 'Int') that can be
--   used per time period (the 'Double' in seconds). Shake will ensure that actions using the same throttled resource
--   do not exceed the limits.
--   As an example, suppose that making more than 1 request every 5 seconds to
--   Google results in our client being blacklisted; we can write:
--
-- @
-- google <- 'Development.Shake.newThrottle' \"Google\" 1 5
-- \"*.url\" 'Development.Shake.%>' \\out -> do
--     'Development.Shake.withResource' google 1 $
--         'Development.Shake.cmd' \"wget\" [\"https:\/\/google.com?q=\" ++ 'Development.Shake.FilePath.takeBaseName' out] \"-O\" [out]
-- @
--
--   Now we will wait at least 5 seconds after querying Google before performing another query. If Google changes the rules to
--   allow 12 requests per minute we can instead use @'Development.Shake.newThrottle' \"Google\" 12 60@, which would allow
--   greater parallelisation, and avoid throttling entirely if only a small number of requests are necessary.
--
--   In the original example we never make a fresh request until 5 seconds after the previous request has /completed/. If we instead
--   want to throttle requests since the previous request /started/ we can write:
--
-- @
-- google <- 'Development.Shake.newThrottle' \"Google\" 1 5
-- \"*.url\" 'Development.Shake.%>' \\out -> do
--     'Development.Shake.withResource' google 1 $ pure ()
--     'Development.Shake.cmd' \"wget\" [\"https:\/\/google.com?q=\" ++ 'Development.Shake.FilePath.takeBaseName' out] \"-O\" [out]
-- @
--
--   However, the rule may not continue running immediately after 'Development.Shake.withResource' completes, so while
--   we will never exceed an average of 1 request every 5 seconds, we may end up running an unbounded number of
--   requests simultaneously. If this limitation causes a problem in practice it can be fixed.
newThrottle :: String -> Int -> Double -> Rules Resource
newThrottle name count period = liftIO $ newThrottleIO name count period

-- | Run an action which uses part of several finite resources. Acquires the resources in a stable
--   order, to prevent deadlock.
--   If all rules requiring more than one resource acquire those
--   resources with a single call to 'withResources', resources will not deadlock.
withResources :: [(Resource, Int)] -> Action a -> Action a
withResources res act
    | (r,i):_ <- filter ((< 0) . snd) res = error $ "You cannot acquire a negative quantity of " ++ show r ++ ", requested " ++ show i
    | otherwise = f $ groupSort res
    where
        f [] = act
        f ((r,xs):rs) = withResource r (sum xs) $ f rs

-- | Given an action on a key, produce a cached version that will execute the action at most once per key per run.
--   Using the cached result will still include any dependencies that the action requires - e.g. if the action
--   does 'need' then those dependencies will be added to every rule that uses that cache.
--   Each call to 'newCache' creates a separate cache that is independent of all other calls to 'newCache'.
--
--   The operations will not be cached between runs and nothing will be persisted to the Shake database.
--   For an alternative that does persist the cache, see 'Development.Shake.addOracleCache'.
--
--   This function is useful when creating files that store intermediate values,
--   to avoid the overhead of repeatedly reading from disk, particularly if the file requires expensive parsing.
--   As an example:
--
-- @
-- digits \<- 'newCache' $ \\file -> do
--     src \<- readFile\' file
--     pure $ length $ filter isDigit src
-- \"*.digits\" 'Development.Shake.%>' \\x -> do
--     v1 \<- digits ('dropExtension' x)
--     v2 \<- digits ('dropExtension' x)
--     'Development.Shake.writeFile'' x $ show (v1,v2)
-- @
--
--   To create the result @MyFile.txt.digits@ the file @MyFile.txt@ will be read and counted, but only at most
--   once per execution.
newCache :: (Eq k, Hashable k) => (k -> Action v) -> Rules (k -> Action v)
newCache = liftIO .
newCacheIO shake-0.19.8/src/Development/Shake/Internal/Errors.hs0000644000000000000000000001277607346545000020602 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable, PatternGuards, RecordWildCards, ConstraintKinds #-} -- | Errors seen by the user module Development.Shake.Internal.Errors( ShakeException(..), throwM, throwImpure, errorInternal, errorStructured, errorNoRuleToBuildType, errorRuleDefinedMultipleTimes, errorMultipleRulesMatch, errorRuleRecursion, errorComplexRecursion, errorNoApply, errorDirectoryNotFile, errorNoHash ) where import Data.Tuple.Extra import Control.Exception.Extra import Control.Monad.IO.Class import General.Extra import Data.Typeable import Data.List.Extra import Data.Maybe throwM :: MonadIO m => SomeException -> m a throwM = liftIO . throwIO throwImpure :: SomeException -> a throwImpure = throw errorInternal :: Partial => String -> SomeException errorInternal msg = toException $ ErrorCall $ unlines $ ("Development.Shake: Internal error, please report to Neil Mitchell (" ++ msg ++ ")") : callStackFull alternatives = let (*) = (,) in ["_rule_" * "oracle" ,"_Rule_" * "Oracle" ,"_key_" * "question" ,"_Key_" * "Question" ,"_result_" * "answer" ,"_Result_" * "Answer" ,"_addBuiltinRule_" * "addOracle" ,"_apply_" * "askOracle"] errorStructured :: String -> [(String, Maybe String)] -> String -> SomeException errorStructured msg args hint = toException $ ErrorCall $ unlines $ [msg ++ (if null args then "." else ":")] ++ [" " ++ a ++ [':' | a /= ""] ++ replicate (as - length a + 2) ' ' ++ b | (a,b) <- args2] ++ [hint | hint /= ""] where as = maximum $ 0 : map (length . fst) args2 args2 = [(a,b) | (a,Just b) <- args] structured :: Bool -> String -> [(String, Maybe String)] -> String -> SomeException structured alt msg args hint = errorStructured (f msg) (map (first f) args) (f hint) where f = filter (/= '_') . 
(if alt then g else id) g xs | res:_ <- [to ++ g rest | (from, to) <- alternatives, Just rest <- [stripPrefix from xs]] = res g (x:xs) = x : g xs g [] = [] errorDirectoryNotFile :: FilePath -> SomeException errorDirectoryNotFile dir = errorStructured "Build system error - expected a file, got a directory" [("Directory", Just dir)] "Probably due to calling 'need' on a directory. Shake only permits 'need' on files." errorNoRuleToBuildType :: TypeRep -> Maybe String -> Maybe TypeRep -> SomeException errorNoRuleToBuildType tk k tv = structured (specialIsOracleKey tk) "Build system error - no _rule_ matches the _key_ type" [("_Key_ type", Just $ show tk) ,("_Key_ value", k) ,("_Result_ type", fmap show tv)] "You are missing a call to _addBuiltinRule_, or your call to _apply_ has the wrong _key_ type" errorRuleDefinedMultipleTimes :: TypeRep -> [String] -> SomeException errorRuleDefinedMultipleTimes tk locations = structured (specialIsOracleKey tk) "Build system error - _rule_ defined twice at one _key_ type" (("_Key_ type", Just $ show tk) : [("Location " ++ show i, Just x) | (i, x) <- zipFrom 1 locations]) "You have called _addBuiltinRule_ more than once on the same key type" errorMultipleRulesMatch :: TypeRep -> String -> [Maybe String] -> SomeException errorMultipleRulesMatch tk k names = errorStructured ("Build system error - key matches " ++ (if null names then "no" else "multiple") ++ " rules") ([("Key type",Just $ show tk) ,("Key value",Just k) ,("Rules matched",Just $ show $ length names)] ++ [("Rule " ++ show i, x) | any isJust names, (i, x) <- zipFrom 1 names]) (if null names then "Either add a rule that produces the above key, or stop requiring the above key" else "Modify your rules so only one can produce the above key") errorNoHash :: SomeException errorNoHash = errorStructured "Cannot use shakeChange=ChangeModTime with shakeShare" [] "" errorRuleRecursion :: TypeRep -> String -> SomeException -- may involve both rules and oracle, so report as only rules 
errorRuleRecursion tk k = errorStructured
    "Build system error - recursion detected"
    [("Key type",Just $ show tk)
    ,("Key value",Just k)]
    "Rules may not be recursive"

errorComplexRecursion :: [String] -> SomeException
errorComplexRecursion ks = errorStructured
    "Build system error - indirect recursion detected"
    [("Key value " ++ show i, Just k) | (i, k) <- zipFrom 1 ks]
    "Rules may not be recursive"

errorNoApply :: TypeRep -> Maybe String -> String -> SomeException
errorNoApply tk k msg = errorStructured
    "Build system error - cannot currently introduce a dependency (e.g. calling 'apply')"
    [("Reason", Just msg)
    ,("Key type", Just $ show tk)
    ,("Key value", k)]
    "Move the call earlier/later"

-- Should be in Special, but then we get an import cycle
specialIsOracleKey :: TypeRep -> Bool
specialIsOracleKey t = con == "OracleQ"
    where con = show $ fst $ splitTyConApp t

-- | Error representing all expected exceptions thrown by Shake.
--   Problems when executing rules will be raised using this exception type.
data ShakeException = ShakeException
    {shakeExceptionTarget :: String -- ^ The target that was being built when the exception occurred.
    ,shakeExceptionStack :: [String] -- ^ A description of the call stack, one entry per line.
    ,shakeExceptionInner :: SomeException -- ^ The underlying exception that was raised.
} deriving Typeable instance Exception ShakeException instance Show ShakeException where show ShakeException{..} = unlines $ "Error when running Shake build system:" : shakeExceptionStack ++ [displayException shakeExceptionInner] shake-0.19.8/src/Development/Shake/Internal/FileInfo.hs0000644000000000000000000001447007346545000021012 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, CPP, ForeignFunctionInterface #-} module Development.Shake.Internal.FileInfo( noFileHash, isNoFileHash, FileSize, ModTime, FileHash, getFileHash, getFileInfo ) where #ifndef MIN_VERSION_unix #define MIN_VERSION_unix(a,b,c) 0 #endif #ifndef MIN_VERSION_time #define MIN_VERSION_time(a,b,c) 0 #endif import Data.Hashable import Control.Exception.Extra import Development.Shake.Classes import Development.Shake.Internal.FileName import qualified Data.ByteString.Lazy.Internal as LBS (defaultChunkSize) import Data.List.Extra import Data.Word import Numeric import System.IO import Foreign #if defined(PORTABLE) import System.IO.Error import System.Directory import Data.Time #elif defined(mingw32_HOST_OS) import Development.Shake.Internal.Errors import Control.Monad import qualified Data.ByteString.Char8 as BS import Foreign.C.String import Data.Char #else #if MIN_VERSION_time(1,9,1) import Data.Time.Clock import Data.Fixed #endif import Development.Shake.Internal.Errors import GHC.IO.Exception import System.IO.Error import System.Posix.Files.ByteString #endif -- A piece of file information, where 0 and 1 are special (see fileInfo* functions) newtype FileInfo a = FileInfo Word32 deriving (Typeable,Hashable,Binary,Storable,NFData) noFileHash :: FileHash noFileHash = FileInfo 1 -- Equal to nothing isNoFileHash :: FileHash -> Bool isNoFileHash (FileInfo i) = i == 1 fileInfo :: Word32 -> FileInfo a fileInfo a = FileInfo $ if a > maxBound - 2 then a else a + 2 instance Show (FileInfo a) where show (FileInfo x) | x == 0 = "EQ" | x == 1 = "NEQ" | otherwise = "0x" ++ upper 
(showHex (x-2) "") instance Eq (FileInfo a) where FileInfo a == FileInfo b | a == 0 || b == 0 = True | a == 1 || b == 1 = False | otherwise = a == b data FileInfoHash; type FileHash = FileInfo FileInfoHash data FileInfoMod ; type ModTime = FileInfo FileInfoMod data FileInfoSize; type FileSize = FileInfo FileInfoSize getFileHash :: FileName -> IO FileHash getFileHash x = withFile (fileNameToString x) ReadMode $ \h -> allocaBytes LBS.defaultChunkSize $ \ptr -> go h ptr (hash ()) where go h ptr salt = do n <- hGetBufSome h ptr LBS.defaultChunkSize if n == 0 then pure $! fileInfo $ fromIntegral salt else go h ptr =<< hashPtrWithSalt ptr n salt -- If the result isn't strict then we are referencing a much bigger structure, -- and it causes a space leak I don't really understand on Linux when running -- the 'tar' test, followed by the 'benchmark' test. -- See this blog post: https://neilmitchell.blogspot.co.uk/2015/09/three-space-leaks.html result :: Word32 -> Word32 -> IO (Maybe (ModTime, FileSize)) result x y = do x <- evaluate $ fileInfo x y <- evaluate $ fileInfo y pure $ Just (x, y) -- | True = allow directory, False = disallow getFileInfo :: Bool -> FileName -> IO (Maybe (ModTime, FileSize)) #if defined(PORTABLE) -- Portable fallback getFileInfo allowDir x = handleBool isDoesNotExistError (const $ pure Nothing) $ do let file = fileNameToString x time <- getModificationTime file size <- withFile file ReadMode hFileSize result (extractFileTime time) (fromIntegral size) extractFileTime :: UTCTime -> Word32 extractFileTime = floor . fromRational . toRational . 
utctDayTime #elif defined(mingw32_HOST_OS) -- Directly against the Win32 API, twice as fast as the portable version getFileInfo allowDir x = BS.useAsCString (fileNameToByteString x) $ \file -> alloca_WIN32_FILE_ATTRIBUTE_DATA $ \fad -> do res <- c_GetFileAttributesExA file 0 fad let peek = do code <- peekFileAttributes fad if not allowDir && testBit code 4 then throwIO $ errorDirectoryNotFile $ fileNameToString x else join $ liftM2 result (peekLastWriteTimeLow fad) (peekFileSizeLow fad) if res then peek else if BS.any (>= chr 0x80) (fileNameToByteString x) then withCWString (fileNameToString x) $ \file -> do res <- c_GetFileAttributesExW file 0 fad if res then peek else pure Nothing else pure Nothing #ifdef x86_64_HOST_ARCH #define CALLCONV ccall #else #define CALLCONV stdcall #endif foreign import CALLCONV unsafe "Windows.h GetFileAttributesExA" c_GetFileAttributesExA :: CString -> Int32 -> Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Bool foreign import CALLCONV unsafe "Windows.h GetFileAttributesExW" c_GetFileAttributesExW :: CWString -> Int32 -> Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Bool data WIN32_FILE_ATTRIBUTE_DATA alloca_WIN32_FILE_ATTRIBUTE_DATA :: (Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO a) -> IO a alloca_WIN32_FILE_ATTRIBUTE_DATA act = allocaBytes size_WIN32_FILE_ATTRIBUTE_DATA act where size_WIN32_FILE_ATTRIBUTE_DATA = 36 peekFileAttributes :: Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Word32 peekFileAttributes p = peekByteOff p index_WIN32_FILE_ATTRIBUTE_DATA_dwFileAttributes where index_WIN32_FILE_ATTRIBUTE_DATA_dwFileAttributes = 0 peekLastWriteTimeLow :: Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Word32 peekLastWriteTimeLow p = peekByteOff p index_WIN32_FILE_ATTRIBUTE_DATA_ftLastWriteTime_dwLowDateTime where index_WIN32_FILE_ATTRIBUTE_DATA_ftLastWriteTime_dwLowDateTime = 20 peekFileSizeLow :: Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Word32 peekFileSizeLow p = peekByteOff p index_WIN32_FILE_ATTRIBUTE_DATA_nFileSizeLow where index_WIN32_FILE_ATTRIBUTE_DATA_nFileSizeLow = 32 #else -- 
Unix version getFileInfo allowDir x = handleBool isDoesNotExistError' (const $ pure Nothing) $ do s <- getFileStatus $ fileNameToByteString x if not allowDir && isDirectory s then throwM $ errorDirectoryNotFile $ fileNameToString x else result (extractFileTime s) (fromIntegral $ fileSize s) where isDoesNotExistError' e = isDoesNotExistError e || ioeGetErrorType e == InappropriateType extractFileTime :: FileStatus -> Word32 #if MIN_VERSION_unix(2,6,0) #if MIN_VERSION_time(1,9,1) extractFileTime = fromInteger . (\(MkFixed x) -> x) . nominalDiffTimeToSeconds . modificationTimeHiRes #else extractFileTime x = ceiling $ modificationTimeHiRes x * 1e4 #endif #else extractFileTime x = fromIntegral $ fromEnum $ modificationTime x #endif #endif shake-0.19.8/src/Development/Shake/Internal/FileName.hs0000644000000000000000000000505407346545000020775 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, FlexibleInstances #-} module Development.Shake.Internal.FileName( FileName, fileNameFromString, fileNameFromByteString, fileNameToString, fileNameToByteString, filepathNormalise ) where import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.UTF8 as UTF8 import Development.Shake.Classes import qualified System.FilePath as Native import General.Binary import System.Info.Extra import Data.List --------------------------------------------------------------------- -- FileName newtype -- | UTF8 ByteString newtype FileName = FileName BS.ByteString deriving (Hashable, Binary, BinaryEx, Eq, NFData) instance Show FileName where show = fileNameToString instance BinaryEx [FileName] where putEx = putEx . map (\(FileName x) -> x) getEx = map FileName . getEx fileNameToString :: FileName -> FilePath fileNameToString = UTF8.toString . fileNameToByteString fileNameToByteString :: FileName -> BS.ByteString fileNameToByteString (FileName x) = x fileNameFromString :: FilePath -> FileName fileNameFromString = fileNameFromByteString . 
UTF8.fromString fileNameFromByteString :: BS.ByteString -> FileName fileNameFromByteString = FileName . filepathNormalise --------------------------------------------------------------------- -- NORMALISATION -- | Equivalent to @toStandard . normaliseEx@ from "Development.Shake.FilePath". filepathNormalise :: BS.ByteString -> BS.ByteString filepathNormalise xs | isWindows, Just (a,xs) <- BS.uncons xs, sep a, Just (b,_) <- BS.uncons xs, sep b = '/' `BS.cons` f xs | otherwise = f xs where sep = Native.isPathSeparator f o = deslash o $ BS.concat $ (slash:) $ intersperse slash $ reverse $ (BS.empty:) $ g 0 $ reverse $ split o deslash o x | x == slash = case (pre,pos) of (True,True) -> slash (True,False) -> BS.pack "/." (False,True) -> BS.pack "./" (False,False) -> dot | otherwise = (if pre then id else BS.tail) $ (if pos then id else BS.init) x where pre = not (BS.null o) && sep (BS.head o) pos = not (BS.null o) && sep (BS.last o) g i [] = replicate i dotDot g i (x:xs) | BS.null x = g i xs g i (x:xs) | x == dotDot = g (i+1) xs g i (x:xs) | x == dot = g i xs g 0 (x:xs) = x : g 0 xs g i (_:xs) = g (i-1) xs -- equivalent to eliminating ../x split = BS.splitWith sep dotDot = BS.pack ".." dot = BS.singleton '.' slash = BS.singleton '/' shake-0.19.8/src/Development/Shake/Internal/FilePattern.hs0000644000000000000000000003033207346545000021527 0ustar0000000000000000{-# LANGUAGE LambdaCase #-} {-# LANGUAGE PatternGuards, ViewPatterns, TupleSections #-} module Development.Shake.Internal.FilePattern( -- * Primitive API, as exposed FilePattern, (?==), (), -- * General API, used by other people. 
filePattern, -- * Optimisation opportunities simple, (?==*), -- * Multipattern file rules compatible, extract, substitute, -- * Accelerated searching Walk(..), walk, -- * Testing only internalTest, isRelativePath, isRelativePattern ) where import Development.Shake.Internal.Errors import System.FilePath(isPathSeparator) import Data.List.Extra import Control.Monad import Data.Char import Data.Maybe import System.Info.Extra -- | A type synonym for file patterns, containing @\/\/@ and @*@. For the syntax -- and semantics of 'FilePattern' see '?=='. -- -- Most 'normaliseEx'd 'FilePath' values are suitable as 'FilePattern' values which match -- only that specific file. On Windows @\\@ is treated as equivalent to @\/@. -- -- You can write 'FilePattern' values as a literal string, or build them -- up using the operators 'Development.Shake.FilePath.<.>', 'Development.Shake.FilePath.' -- and 'Development.Shake.'. However, beware that: -- -- * On Windows, use 'Development.Shake.FilePath.<.>' from "Development.Shake.FilePath" instead of from -- "System.FilePath" - otherwise @\"\/\/*\" \<.\> exe@ results in @\"\/\/*\\\\.exe\"@. -- -- * If the second argument of 'Development.Shake.FilePath.' has a leading path separator (namely @\/@) -- then the second argument will be returned. type FilePattern = String infixr 5 -- | Join two 'FilePattern' values by inserting two @\/@ characters between them. -- Will first remove any trailing path separators on the first argument, and any leading -- separators on the second. -- -- > "dir" "*" == "dir//*" () :: FilePattern -> FilePattern -> FilePattern a b = dropWhileEnd isPathSeparator a ++ "//" ++ dropWhile isPathSeparator b --------------------------------------------------------------------- -- PATTERNS data Pat = Lit String -- ^ foo | Star -- ^ /*/ | Skip -- ^ // | Skip1 -- ^ //, but must be at least 1 element | Stars String [String] String -- ^ *foo*, prefix (fixed), infix floaters, suffix -- e.g. 
*foo*bar = Stars "" ["foo"] "bar" deriving (Show,Eq,Ord) fromLit :: Pat -> Maybe String fromLit (Lit x) = Just x fromLit _ = Nothing data Lexeme = Str String | Slash | SlashSlash lexer :: FilePattern -> [Lexeme] lexer "" = [] lexer (x1:x2:xs) | isPathSeparator x1, isPathSeparator x2 = SlashSlash : lexer xs lexer (x1:xs) | isPathSeparator x1 = Slash : lexer xs lexer xs = Str a : lexer b where (a,b) = break isPathSeparator xs -- | Parse a FilePattern. All optimisations I can think of are invalid because they change the extracted expressions. parse :: FilePattern -> [Pat] parse = f False True . lexer where -- str = I have ever seen a Str go past (equivalent to "can I be satisfied by no paths") -- slash = I am either at the start, or my previous character was Slash f str slash = \case [] -> [Lit "" | slash] Str "**":xs -> Skip : f True False xs Str x:xs -> parseLit x : f True False xs SlashSlash:Slash:xs | not str -> Skip1 : f str True xs SlashSlash:xs -> Skip : f str False xs Slash:xs -> [Lit "" | not str] ++ f str True xs parseLit :: String -> Pat parseLit "*" = Star parseLit x = case split (== '*') x of [x] -> Lit x pre:xs | Just (mid,post) <- unsnoc xs -> Stars pre mid post _ -> Lit "" internalTest :: IO () internalTest = do let x # y = let p = parse x in when (p /= y) $ fail $ show ("FilePattern.internalTest",x,p,y) "" # [Lit ""] "x" # [Lit "x"] "/" # [Lit "",Lit ""] "x/" # [Lit "x",Lit ""] "/x" # [Lit "",Lit "x"] "x/y" # [Lit "x",Lit "y"] "//" # [Skip] "**" # [Skip] "//x" # [Skip, Lit "x"] "**/x" # [Skip, Lit "x"] "x//" # [Lit "x", Skip] "x/**" # [Lit "x", Skip] "x//y" # [Lit "x",Skip, Lit "y"] "x/**/y" # [Lit "x",Skip, Lit "y"] "///" # [Skip1, Lit ""] "**/**" # [Skip,Skip] "**/**/" # [Skip, Skip, Lit ""] "///x" # [Skip1, Lit "x"] "**/x" # [Skip, Lit "x"] "x///" # [Lit "x", Skip, Lit ""] "x/**/" # [Lit "x", Skip, Lit ""] "x///y" # [Lit "x",Skip, Lit "y"] "x/**/y" # [Lit "x",Skip, Lit "y"] "////" # [Skip, Skip] "**/**/**" # [Skip, Skip, Skip] "////x" # [Skip, 
Skip, Lit "x"] "x////" # [Lit "x", Skip, Skip] "x////y" # [Lit "x",Skip, Skip, Lit "y"] "**//x" # [Skip, Skip, Lit "x"] -- | Optimisations that may change the matched expressions optimise :: [Pat] -> [Pat] optimise (Skip:Skip:xs) = optimise $ Skip:xs optimise (Skip:Star:xs) = optimise $ Skip1:xs optimise (Star:Skip:xs) = optimise $ Skip1:xs optimise (x:xs) = x : optimise xs optimise [] =[] -- | A 'FilePattern' that will only match 'isRelativePath' values. isRelativePattern :: FilePattern -> Bool isRelativePattern ('*':'*':xs) | [] <- xs = True | x:_ <- xs, isPathSeparator x = True isRelativePattern _ = False -- | A non-absolute 'FilePath'. isRelativePath :: FilePath -> Bool isRelativePath (x:_) | isPathSeparator x = False isRelativePath (x:':':_) | isWindows, isAlpha x = False isRelativePath _ = True -- | Given a pattern, and a list of path components, return a list of all matches -- (for each wildcard in order, what the wildcard matched). match :: [Pat] -> [String] -> [[String]] match (Skip:xs) (y:ys) = map ("":) (match xs (y:ys)) ++ match (Skip1:xs) (y:ys) match (Skip1:xs) (y:ys) = [(y++"/"++r):rs | r:rs <- match (Skip:xs) ys] match (Skip:xs) [] = map ("":) $ match xs [] match (Star:xs) (y:ys) = map (y:) $ match xs ys match (Lit x:xs) (y:ys) = concat $ [match xs ys | x == y] ++ [match xs (y:ys) | x == "."] match (x@Stars{}:xs) (y:ys) | Just rs <- matchStars x y = map (rs ++) $ match xs ys match [] [] = [[]] match _ _ = [] matchOne :: Pat -> String -> Bool matchOne (Lit x) y = x == y matchOne x@Stars{} y = isJust $ matchStars x y matchOne Star _ = True matchOne p _ = throwImpure $ errorInternal $ "unreachablePattern, matchOne " ++ show p -- Only return the first (all patterns left-most) valid star matching matchStars :: Pat -> String -> Maybe [String] matchStars (Stars pre mid post) x = do x <- stripPrefix pre x x <- if null post then Just x else stripSuffix post x stripInfixes mid x where stripInfixes [] x = Just [x] stripInfixes (m:ms) x = do (a,x) <- stripInfix 
m x
        (a:) <$> stripInfixes ms x
matchStars p _ = throwImpure $ errorInternal $ "unreachablePattern, matchStars " ++ show p

-- | Match a 'FilePattern' against a 'FilePath'. There are three special forms:
--
-- * @*@ matches an entire path component, excluding any separators.
--
-- * @\/\/@ matches an arbitrary number of path components, including absolute path
--   prefixes.
--
-- * @**@ as a path component matches an arbitrary number of path components, but not
--   absolute path prefixes. Currently considered experimental.
--
--   Some examples:
--
-- * @test.c@ matches @test.c@ and nothing else.
--
-- * @*.c@ matches all @.c@ files in the current directory, so @file.c@ matches,
--   but @file.h@ and @dir\/file.c@ don't.
--
-- * @\/\/*.c@ matches all @.c@ files anywhere on the filesystem,
--   so @file.c@, @dir\/file.c@, @dir1\/dir2\/file.c@ and @\/path\/to\/file.c@ all match,
--   but @file.h@ and @dir\/file.h@ don't.
--
-- * @dir\/*\/*@ matches all files one level below @dir@, so @dir\/one\/file.c@ and
--   @dir\/two\/file.h@ match, but @file.c@, @one\/dir\/file.c@, @dir\/file.h@
--   and @dir\/one\/two\/file.c@ don't.
--
--   Patterns with constructs such as @foo\/..\/bar@ will never match
--   normalised 'FilePath' values, so are unlikely to be correct.
(?==) :: FilePattern -> FilePath -> Bool
(?==) p = case optimise $ parse p of
    [x] | x == Skip || x == Skip1 -> if rp then isRelativePath else const True
    p -> let f = not . null . match p . split isPathSeparator
         in if rp then (\x -> isRelativePath x && f x) else f
    where rp = isRelativePattern p

(?==*) :: [FilePattern] -> FilePath -> Bool
(?==*) ps = \x -> any ($ x) vs
    where vs = map (?==) ps

-- | Like '?==', but returns 'Nothing' if there is no match, otherwise 'Just' with the list
--   of fragments matching each wildcard.
For example: -- -- @ -- 'filePattern' \"**\/*.c\" \"test.txt\" == Nothing -- 'filePattern' \"**\/*.c\" \"foo.c\" == Just [\"",\"foo\"] -- 'filePattern' \"**\/*.c\" \"bar\/baz\/foo.c\" == Just [\"bar\/baz/\",\"foo\"] -- @ -- -- Note that the @**@ will often contain a trailing @\/@, and even on Windows any -- @\\@ separators will be replaced by @\/@. filePattern :: FilePattern -> FilePath -> Maybe [String] filePattern p = \x -> if eq x then Just $ ex x else Nothing where eq = (?==) p ex = extract p --------------------------------------------------------------------- -- MULTIPATTERN COMPATIBLE SUBSTITUTIONS specials :: FilePattern -> [Pat] specials = concatMap f . parse where f Lit{} = [] f Star = [Star] f Skip = [Skip] f Skip1 = [Skip] f (Stars _ xs _) = replicate (length xs + 1) Star -- | Is the pattern free from any * and //. simple :: FilePattern -> Bool simple = null . specials -- | Do they have the same * and // counts in the same order compatible :: [FilePattern] -> Bool compatible [] = True compatible (x:xs) = all ((==) (specials x) . specials) xs -- | Extract the items that match the wildcards. The pair must match with '?=='. extract :: FilePattern -> FilePath -> [String] extract p = let pat = parse p in \x -> case match pat (split isPathSeparator x) of [] | p ?== x -> throwImpure $ errorInternal $ "extract with " ++ show p ++ " and " ++ show x | otherwise -> error $ "Pattern " ++ show p ++ " does not match " ++ x ++ ", when trying to extract the FilePattern matches" ms:_ -> ms -- | Given the result of 'extract', substitute it back in to a 'compatible' pattern. 
-- -- > p '?==' x ==> substitute (extract p x) p == x substitute :: [String] -> FilePattern -> FilePath substitute oms oxs = intercalate "/" $ concat $ snd $ mapAccumL f oms (parse oxs) where f ms (Lit x) = (ms, [x]) f (m:ms) Star = (ms, [m]) f (m:ms) Skip = (ms, split m) f (m:ms) Skip1 = (ms, split m) f ms (Stars pre mid post) = (ms2, [concat $ pre : zipWith (++) ms1 (mid++[post])]) where (ms1,ms2) = splitAt (length mid + 1) ms f _ _ = error $ "Substitution failed into pattern " ++ show oxs ++ " with " ++ show (length oms) ++ " matches, namely " ++ show oms split = linesBy (== '/') --------------------------------------------------------------------- -- EFFICIENT PATH WALKING -- | Given a list of files, return a list of things I can match in this directory -- plus a list of subdirectories and walks that apply to them. -- Use WalkTo when the list can be predicted in advance data Walk = Walk ([String] -> ([String],[(String,Walk)])) | WalkTo ([String],[(String,Walk)]) walk :: [FilePattern] -> (Bool, Walk) walk ps = (any (\p -> isEmpty p || not (null $ match p [""])) ps2, f ps2) where ps2 = map (filter (/= Lit ".") . optimise . 
parse) ps

          f (nubOrd -> ps)
              | Just fin <- mapM fromLit fin
              , Just nxt <- mapM (\(a,b) -> (,f b) <$> fromLit a) nxt
              = WalkTo (fin, nxt)
              | otherwise = Walk $ \xs ->
                  (if finStar then xs else filter (\x -> any (`matchOne` x) fin) xs
                  ,[(x, f ys) | x <- xs, let ys = concat [b | (a,b) <- nxt, matchOne a x], not $ null ys])
              where
                  finStar = Star `elem` fin
                  fin = nubOrd $ mapMaybe final ps
                  nxt = groupSort $ concatMap next ps

next :: [Pat] -> [(Pat, [Pat])]
next (Skip1:xs) = [(Star,Skip:xs)]
next (Skip:xs) = (Star,Skip:xs) : next xs
next (x:xs) = [(x,xs) | not $ null xs]
next [] = []

final :: [Pat] -> Maybe Pat
final (Skip:xs) = if isEmpty xs then Just Star else final xs
final (Skip1:xs) = if isEmpty xs then Just Star else Nothing
final (x:xs) = if isEmpty xs then Just x else Nothing
final [] = Nothing

isEmpty = all (== Skip)

shake-0.19.8/src/Development/Shake/Internal/History/Bloom.hs

{-# LANGUAGE ViewPatterns #-}

-- | The endpoints on the cloud server
module Development.Shake.Internal.History.Bloom(
    Bloom, bloomTest, bloomCreate
    ) where

import Data.Word
import Data.Bits
import Data.Hashable
import Data.Semigroup
import Foreign.Storable
import Foreign.Ptr
import Prelude

-- | Given an Int hash we store
data Bloom a = Bloom
    {-# UNPACK #-} !Word64 {-# UNPACK #-} !Word64
    {-# UNPACK #-} !Word64 {-# UNPACK #-} !Word64
    deriving (Eq,Show)

instance Storable (Bloom a) where
    sizeOf _ = 4 * sizeOf (0 :: Word64)
    alignment _ = alignment (0 :: Word64)
    peek (castPtr -> ptr) = Bloom
        <$> peekElemOff ptr 0 <*> peekElemOff ptr 1
        <*> peekElemOff ptr 2 <*> peekElemOff ptr 3
    poke (castPtr -> ptr) (Bloom x1 x2 x3 x4) = do
        pokeElemOff ptr 0 x1
        pokeElemOff ptr 1 x2
        pokeElemOff ptr 2 x3
        pokeElemOff ptr 3 x4

instance Semigroup (Bloom a) where
    Bloom x1 x2 x3 x4 <> Bloom y1 y2 y3 y4 = Bloom (x1 .|.
y2) (x3 .|. y3) (x4 .|. y4)

instance Monoid (Bloom a) where
    mempty = Bloom 0 0 0 0
    mappend = (<>)

-- Should the cloud need to know about Key's? It only needs to do Eq on them...
-- If your Key has a smart Eq your build tree might be more diverse
-- Have the Id resolved in Server.

bloomTest :: Hashable a => Bloom a -> a -> Bool
bloomTest bloom x = bloomCreate x <> bloom == bloom

bloomCreate :: Hashable a => a -> Bloom a
bloomCreate (fromIntegral . hash -> x) = Bloom (f 1) (f 2) (f 3) (f 4)
    where f i = x `xor` rotate x i

shake-0.19.8/src/Development/Shake/Internal/History/Cloud.hs

-- | The endpoints on the server
module Development.Shake.Internal.History.Cloud(
    Cloud, newCloud, addCloud, lookupCloud
    ) where

import Development.Shake.Internal.Value
import Development.Shake.Internal.Core.Database
import Development.Shake.Internal.History.Types
import Development.Shake.Internal.History.Network
import Development.Shake.Internal.History.Server
import Development.Shake.Internal.History.Bloom
import Control.Concurrent.Extra
import System.Time.Extra
import Control.Monad
import Control.Monad.IO.Class
import Control.Monad.Trans.Maybe
import Control.Monad.Trans.Class
import General.Fence
import qualified Data.HashMap.Strict as Map
import Data.Typeable
import Data.Either.Extra
import General.Binary
import General.Extra
import General.Wait

type Initial = Map.HashMap Key (Ver, [Key], Bloom [BS_Identity])

data Cloud = Cloud Server (Locked () -> IO ()) (Fence Locked Initial)

newLaterFence :: (Locked () -> IO ()) -> Seconds -> a -> IO a -> IO (Fence Locked a)
newLaterFence relock maxTime def act = do
    fence <- newFence
    forkFinally (timeout maxTime act) $ \res -> relock $ signalFence fence $ case res of
        Right (Just v) -> v
        _ -> def
    pure fence

laterFence :: MonadIO m => Fence m a -> Wait m a
laterFence fence = do
    res <- liftIO $ testFence fence
    case res of
        Just v -> pure v
        Nothing -> Later $ waitFence fence

newCloud
    :: (Locked () -> IO ()) -> Map.HashMap TypeRep (BinaryOp Key) -> Ver -> [(TypeRep, Ver)] -> [String] -> Maybe (IO Cloud)
newCloud relock binop globalVer ruleVer urls = flip fmap (if null urls then Nothing else connect $ last urls) $ \conn -> do
    conn <- conn
    server <- newServer conn binop globalVer
    fence <- newLaterFence relock 10 Map.empty $ do
        xs <- serverAllKeys server ruleVer
        pure $ Map.fromList [(k,(v,ds,test)) | (k,v,ds,test) <- xs]
    pure $ Cloud server relock fence

addCloud :: Cloud -> Key -> Ver -> Ver -> [[(Key, BS_Identity)]] -> BS_Store -> [FilePath] -> IO ()
addCloud (Cloud server _ _) x1 x2 x3 x4 x5 x6 = void $ forkIO $ serverUpload server x1 x2 x3 x4 x5 x6

lookupCloud :: Cloud -> (Key -> Wait Locked (Maybe BS_Identity)) -> Key -> Ver -> Ver -> Wait Locked (Maybe (BS_Store, [[Key]], IO ()))
lookupCloud (Cloud server relock initial) ask key builtinVer userVer = runMaybeT $ do
    mp <- lift $ laterFence initial
    Just (ver, deps, bloom) <- pure $ Map.lookup key mp
    unless (ver == userVer) $ fail ""
    Right vs <- lift $ firstLeftWaitUnordered (fmap (maybeToEither ()) . ask) deps
    unless (bloomTest bloom vs) $ fail ""
    fence <- liftIO $ newLaterFence relock 10 mempty $ serverOneKey server key builtinVer userVer $ zip deps vs
    tree <- lift $ laterFence fence
    f [deps] tree
    where
        f :: [[Key]] -> BuildTree Key -> MaybeT (Wait Locked) (BS_Store, [[Key]], IO ())
        f ks (Done store xs) = pure (store, reverse ks, serverDownloadFiles server key xs)
        f ks (Depend deps trees) = do
            Right vs <- lift $ firstLeftWaitUnordered (fmap (maybeToEither ()) .
                ask) deps
            Just tree <- pure $ lookup vs trees
            f (deps:ks) tree

shake-0.19.8/src/Development/Shake/Internal/History/Network.hs

{-# LANGUAGE CPP #-}

-- | The network operations available
module Development.Shake.Internal.History.Network(
    Conn, connect, post
    ) where

#ifdef NETWORK
import Network.HTTP
import Network.URI
import Data.List
import Data.Maybe
#endif
import qualified Data.ByteString.Lazy as LBS

newtype Conn = Conn String

connect :: String -> Maybe (IO Conn)
post :: Conn -> String -> LBS.ByteString -> IO LBS.ByteString

#ifndef NETWORK

connect _ = Nothing
post (Conn _) _ _ = fail "impossible to get here"

#else

connect x = Just $ pure $ Conn $ x ++ ['/' | not $ "/" `isSuffixOf` x]

post (Conn prefix) url send = do
    let request = Request
            {rqURI = parseURI_ $ prefix ++ url
            ,rqMethod = POST
            ,rqHeaders = [Header HdrContentType "application/octet-stream", Header HdrContentLength $ show $ LBS.length send]
            ,rqBody = send}
    response <- simpleHTTP request
    case response of
        Left e -> fail $ "Network.post, failed: " ++ show e
        Right v
            | rspCode v /= (2,0,0) -> fail $ "Network.post, failed: " ++ show (rspCode v)
            | otherwise -> pure $ rspBody v

parseURI_ :: String -> URI
parseURI_ x = fromMaybe (error $ "Failed to parse URI, " ++ x) $ parseURI x

#endif

shake-0.19.8/src/Development/Shake/Internal/History/Serialise.hs

{-# LANGUAGE FlexibleInstances, DeriveTraversable #-}

-- | The endpoints on the cloud server
module Development.Shake.Internal.History.Serialise(
    BuildTree(..), WithTypeReps(..), withTypeReps,
    WithKeys(..), withKeys, withIds, withoutKeys,
    SendAllKeys(..), RecvAllKeys(..), SendOneKey(..), RecvOneKey(..),
    SendDownloadFiles(..), SendUpload(..)
    ) where

import Development.Shake.Internal.History.Bloom
import General.Extra
import General.Binary
import General.Ids
import Data.List.Extra
import Development.Shake.Internal.Value
import Development.Shake.Internal.FileInfo
import Development.Shake.Internal.History.Types
import qualified Data.HashMap.Strict as Map
import Data.Semigroup
import Data.Typeable
import Prelude

data BuildTree key
    -- invariant: Entries are sorted
    = Depend [key] [([BS_Identity], BuildTree key)]
    | Done BS_Store [(FilePath, FileSize, FileHash)]

instance BinaryEx (BuildTree Int) where
    getEx = undefined
    putEx = undefined

instance Eq key => Semigroup (BuildTree key) where
    Depend ks1 vs1 <> Depend ks2 vs2
        | ks1 == ks2 = Depend ks1 $ mergeBy undefined vs1 vs2
        | otherwise = Depend ks2 vs2 -- this shouldn't happen, so give up
    x@Done{} <> _ = x
    _ <> y@Done{} = y

instance Eq key => Monoid (BuildTree key) where
    mempty = Depend [] []
    mappend = (<>)

data WithTypeReps a = WithTypeReps [BS_QTypeRep] a

instance BinaryEx a => BinaryEx (WithTypeReps a) where
    putEx = undefined
    getEx = undefined

withTypeReps :: Traversable f => f TypeRep -> WithTypeReps (f Int)
withTypeReps = undefined

data WithKeys a = WithKeys [BS_Key] a

instance BinaryEx a => BinaryEx (WithKeys a) where
    putEx = undefined
    getEx = undefined

withKeys :: Traversable f => f Key -> WithKeys (f Int)
withKeys = undefined

withIds :: Traversable f => (Id -> m Key) -> f Id -> m (WithKeys (f Int))
withIds = undefined

withoutKeys :: Map.HashMap TypeRep (BinaryOp Key) -> WithKeys (f Int) -> f Key
withoutKeys = undefined

data SendAllKeys typ = SendAllKeys Ver [(typ, Ver)]
    deriving (Functor, Foldable, Traversable)

instance BinaryEx (SendAllKeys Int) where
    putEx = undefined
    getEx = undefined

newtype RecvAllKeys key = RecvAllKeys [(key, Ver, [key], Bloom [BS_Identity])]

instance BinaryEx (RecvAllKeys Int) where
    getEx = undefined
    putEx = undefined

data SendOneKey key = SendOneKey Ver key Ver Ver [(key, BS_Identity)]

instance BinaryEx (SendOneKey Int) where
    getEx = undefined
    putEx = undefined

newtype RecvOneKey key = RecvOneKey (BuildTree key)

instance BinaryEx (RecvOneKey Int) where
    getEx = undefined
    putEx = undefined

data SendDownloadFiles key = SendDownloadFiles Ver key Ver Ver [(FilePath, FileSize, FileHash)]

instance BinaryEx (SendDownloadFiles Int) where
    putEx = undefined
    getEx = undefined

data SendUpload key = SendUpload Ver key Ver Ver [[(key, BS_Identity)]] BS_Store [(FilePath, FileSize, FileHash)]

instance BinaryEx (SendUpload Int) where
    putEx = undefined
    getEx = undefined

shake-0.19.8/src/Development/Shake/Internal/History/Server.hs

-- | The endpoints on the cloud server
module Development.Shake.Internal.History.Server(
    Server, BuildTree(..),
    newServer,
    serverAllKeys, serverOneKey, serverDownloadFiles, serverUpload
    ) where

import Development.Shake.Internal.History.Bloom
import Development.Shake.Internal.History.Serialise
import Development.Shake.Internal.Value
import General.Binary
import General.Extra
import qualified Data.HashMap.Strict as Map
import qualified Data.ByteString.Lazy as LBS
import qualified Data.ByteString as BS
import Development.Shake.Internal.FileInfo
import Development.Shake.Internal.History.Types
import Development.Shake.Internal.History.Network
import Data.Typeable

data Server = Server Conn (Map.HashMap TypeRep (BinaryOp Key)) Ver

newServer :: Conn -> Map.HashMap TypeRep (BinaryOp Key) -> Ver -> IO Server
newServer a b c = pure $ Server a b c

serverAllKeys :: Server -> [(TypeRep, Ver)] -> IO [(Key, Ver, [Key], Bloom [BS_Identity])]
serverAllKeys (Server conn key ver) typs = do
    res <- post conn "allkeys/v1" $ LBS.fromChunks [runBuilder $ putEx $ withTypeReps $ SendAllKeys ver typs]
    let RecvAllKeys ans = withoutKeys key $ getEx $ BS.concat $ LBS.toChunks res
    pure ans

serverOneKey :: Server -> Key -> Ver -> Ver -> [(Key, BS_Identity)] -> IO (BuildTree Key)
serverOneKey _ _ _ _ _ = pure $ Depend [] []
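The `Bloom [BS_Identity]` values returned by `serverAllKeys` let `lookupCloud` rule out keys whose dependency identities cannot possibly match, before any round trip to the server. The scheme in `Bloom.hs` above is tiny: one element becomes four derived bit patterns, filters combine with bitwise OR, and membership is tested by subsumption. A self-contained sketch of that test, using a plain `Word64` stand-in where the real module uses `Data.Hashable`:

```haskell
import Data.Bits (rotate, xor, (.|.))
import Data.Word (Word64)

-- Stand-in for Bloom.hs's Bloom: four words combined with bitwise OR.
data Bloom = Bloom !Word64 !Word64 !Word64 !Word64 deriving (Eq, Show)

union :: Bloom -> Bloom -> Bloom
union (Bloom a b c d) (Bloom w x y z) =
    Bloom (a .|. w) (b .|. x) (c .|. y) (d .|. z)

-- Derive four bit patterns from a single hash, mirroring bloomCreate.
create :: Word64 -> Bloom
create h = Bloom (f 1) (f 2) (f 3) (f 4)
    where f i = h `xor` rotate h i

-- Membership by subsumption, mirroring bloomTest: if OR-ing the element's
-- pattern into the filter changes nothing, every bit of the element was
-- already set. False negatives are impossible; false positives can occur.
test :: Bloom -> Word64 -> Bool
test bloom h = create h `union` bloom == bloom

main :: IO ()
main = do
    let bloom = create 42 `union` create 99
    print $ test bloom 42 -- True: an inserted element is always found
    print $ test bloom 99 -- True
```

A negative `test` answer is definitive, which is exactly what `lookupCloud` needs to bail out early; a positive answer still has to be confirmed by `serverOneKey`.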
serverDownloadFiles :: Server -> Key -> [(FilePath, FileSize, FileHash)] -> IO ()
serverDownloadFiles _ _ _ = fail "Failed to download the files"

serverUpload :: Server -> Key -> Ver -> Ver -> [[(Key, BS_Identity)]] -> BS_Store -> [FilePath] -> IO ()
serverUpload _ key _ _ _ _ _ = print ("SERVER", "Uploading key", key)

shake-0.19.8/src/Development/Shake/Internal/History/Shared.hs

{-# LANGUAGE RecordWildCards, ScopedTypeVariables, TupleSections #-}

module Development.Shake.Internal.History.Shared(
    Shared, newShared,
    addShared, lookupShared,
    removeShared, listShared, sanityShared
    ) where

import Control.Exception
import Development.Shake.Internal.Value
import Development.Shake.Internal.History.Types
import Development.Shake.Internal.History.Symlink
import Development.Shake.Internal.Core.Database
import Development.Shake.Classes
import General.Binary
import General.Extra
import Data.List
import Control.Monad.Extra
import System.Directory.Extra
import System.FilePath
import System.IO.Extra
import Numeric
import Development.Shake.Internal.FileInfo
import General.Wait
import Development.Shake.Internal.FileName
import Data.Monoid
import Control.Monad.IO.Class
import Data.Maybe
import qualified Data.ByteString as BS
import Prelude

data Shared = Shared
    {globalVersion :: !Ver
    ,keyOp :: BinaryOp Key
    ,sharedRoot :: FilePath
    ,useSymlink :: Bool
    }

newShared :: Bool -> BinaryOp Key -> Ver -> FilePath -> IO Shared
newShared useSymlink keyOp globalVersion sharedRoot = pure Shared{..}

data Entry = Entry
    {entryKey :: Key
    ,entryGlobalVersion :: !Ver
    ,entryBuiltinVersion :: !Ver
    ,entryUserVersion :: !Ver
    ,entryDepends :: [[(Key, BS_Identity)]]
    ,entryResult :: BS_Store
    ,entryFiles :: [(FilePath, FileHash)]
    } deriving (Show, Eq)

putEntry :: BinaryOp Key -> Entry -> Builder
putEntry binop Entry{..} =
    putExStorable entryGlobalVersion <>
    putExStorable entryBuiltinVersion <>
    putExStorable entryUserVersion <>
    putExN (putOp
        binop entryKey) <>
    putExN (putExList $ map (putExList . map putDepend) entryDepends) <>
    putExN (putExList $ map putFile entryFiles) <>
    putEx entryResult
    where
        putDepend (a,b) = putExN (putOp binop a) <> putEx b
        putFile (a,b) = putExStorable b <> putEx a

getEntry :: BinaryOp Key -> BS.ByteString -> Entry
getEntry binop x
    | (x1, x2, x3, x) <- binarySplit3 x
    , (x4, x) <- getExN x
    , (x5, x) <- getExN x
    , (x6, x7) <- getExN x
    = Entry
        {entryGlobalVersion = x1
        ,entryBuiltinVersion = x2
        ,entryUserVersion = x3
        ,entryKey = getOp binop x4
        ,entryDepends = map (map getDepend . getExList) $ getExList x5
        ,entryFiles = map getFile $ getExList x6
        ,entryResult = getEx x7
        }
    where
        getDepend x | (a, b) <- getExN x = (getOp binop a, getEx b)
        getFile x | (b, a) <- binarySplit x = (getEx a, b)

hexed x = showHex (abs $ hash x) ""

-- | The path under which everything relating to a Key lives
sharedFileDir :: Shared -> Key -> FilePath
sharedFileDir shared key = sharedRoot shared </> ".shake.cache" </> hexed key

-- | The list of files containing Entry values, given a result of 'sharedFileDir'
sharedFileKeys :: FilePath -> IO [FilePath]
sharedFileKeys dir = do
    b <- doesDirectoryExist_ $ dir </> "_key"
    if not b then pure [] else listFiles $ dir </> "_key"

loadSharedEntry :: Shared -> Key -> Ver -> Ver -> IO [IO (Maybe Entry)]
loadSharedEntry shared@Shared{..} key builtinVersion userVersion =
    map f <$> sharedFileKeys (sharedFileDir shared key)
    where
        f file = do
            e@Entry{..} <- getEntry keyOp <$> BS.readFile file
            let valid = entryKey == key && entryGlobalVersion == globalVersion && entryBuiltinVersion == builtinVersion && entryUserVersion == userVersion
            pure $ if valid then Just e else Nothing

-- | Given a way to get the identity, see if you can find a stored cloud version
lookupShared :: Shared -> (Key -> Wait Locked (Maybe BS_Identity)) -> Key -> Ver -> Ver -> Wait Locked (Maybe (BS_Store, [[Key]], IO ()))
lookupShared shared ask key builtinVersion userVersion = do
    ents <- liftIO $ loadSharedEntry shared key
        builtinVersion userVersion
    flip firstJustWaitUnordered ents $ \act -> do
        me <- liftIO act
        case me of
            Nothing -> pure Nothing
            Just Entry{..} -> do
                -- use Nothing to indicate success, Just () to bail out early on mismatch
                let result x = if isJust x then Nothing else Just $ (entryResult, map (map fst) entryDepends, ) $ do
                        let dir = sharedFileDir shared entryKey
                        forM_ entryFiles $ \(file, hash) ->
                            copyFileLink (useSymlink shared) (dir </> show hash) file
                result <$> firstJustM id
                    [ firstJustWaitUnordered id
                        [ test <$> ask k
                        | (k, i1) <- kis
                        , let test = maybe (Just ()) (\i2 -> if i1 == i2 then Nothing else Just ())]
                    | kis <- entryDepends]

saveSharedEntry :: Shared -> Entry -> IO ()
saveSharedEntry shared entry = do
    let dir = sharedFileDir shared (entryKey entry)
    createDirectoryRecursive dir
    forM_ (entryFiles entry) $ \(file, hash) ->
        unlessM (doesFileExist_ $ dir </> show hash) $
            copyFileLink (useSymlink shared) file (dir </> show hash)
    -- Write key after files to make sure cache is always useable
    let v = runBuilder $ putEntry (keyOp shared) entry
    let dirName = dir </> "_key"
    createDirectoryRecursive dirName
    -- #757, make sure we write this file atomically
    (tempFile, cleanUp) <- newTempFileWithin dir
    (BS.writeFile tempFile v >> renameFile tempFile (dirName </> hexed v)) `onException` cleanUp

addShared :: Shared -> Key -> Ver -> Ver -> [[(Key, BS_Identity)]] -> BS_Store -> [FilePath] -> IO ()
addShared shared entryKey entryBuiltinVersion entryUserVersion entryDepends entryResult files = do
    files <- mapM (\x -> (x,) <$> getFileHash (fileNameFromString x)) files
    saveSharedEntry shared Entry{entryFiles = files, entryGlobalVersion = globalVersion shared, ..}

removeShared :: Shared -> (Key -> Bool) -> IO ()
removeShared Shared{..} test = do
    dirs <- listDirectories $ sharedRoot </> ".shake.cache"
    deleted <- forM dirs $ \dir -> do
        files <- sharedFileKeys dir
        -- if any key matches, clean them all out
        b <- flip anyM files $ \file ->
            handleSynchronous (\e -> putStrLn ("Warning: " ++ show e) >> pure
                False) $
                evaluate . test . entryKey . getEntry keyOp =<< BS.readFile file
        when b $ removePathForcibly dir
        pure b
    liftIO $ putStrLn $ "Deleted " ++ show (length (filter id deleted)) ++ " entries"

listShared :: Shared -> IO ()
listShared Shared{..} = do
    dirs <- listDirectories $ sharedRoot </> ".shake.cache"
    forM_ dirs $ \dir -> do
        putStrLn $ "Directory: " ++ dir
        keys <- sharedFileKeys dir
        forM_ keys $ \key -> handleSynchronous (\e -> putStrLn $ "Warning: " ++ show e) $ do
            Entry{..} <- getEntry keyOp <$> BS.readFile key
            putStrLn $ " Key: " ++ show entryKey
            forM_ entryFiles $ \(file,_) ->
                putStrLn $ " File: " ++ file

sanityShared :: Shared -> IO ()
sanityShared Shared{..} = do
    dirs <- listDirectories $ sharedRoot </> ".shake.cache"
    forM_ dirs $ \dir -> do
        putStrLn $ "Directory: " ++ dir
        keys <- sharedFileKeys dir
        forM_ keys $ \key -> handleSynchronous (\e -> putStrLn $ "Warning: " ++ show e) $ do
            Entry{..} <- getEntry keyOp <$> BS.readFile key
            putStrLn $ " Key: " ++ show entryKey
            putStrLn $ " Key file: " ++ key
            forM_ entryFiles $ \(file,hash) -> checkFile file dir hash
    where
        checkFile filename dir keyHash = do
            let cachefile = dir </> show keyHash
            putStrLn $ " File: " ++ filename
            putStrLn $ " Cache file: " ++ cachefile
            ifM (not <$> doesFileExist_ cachefile)
                (putStrLn " Error: cache file does not exist") $
                ifM ((/= keyHash) <$> getFileHash (fileNameFromString cachefile))
                    (putStrLn " Error: cache file hash does not match stored hash")
                    (putStrLn " OK")

shake-0.19.8/src/Development/Shake/Internal/History/Symlink.hs

{-# LANGUAGE CPP #-}

module Development.Shake.Internal.History.Symlink(
    copyFileLink, createLinkMaybe
    ) where

import Control.Monad.Extra
import General.Extra
import System.Directory
import System.FilePath

#ifdef mingw32_HOST_OS
import Foreign.Ptr
import Foreign.C.String
#else
import System.Posix.Files(createLink)
#endif

createLinkMaybe :: FilePath -> FilePath -> IO (Maybe String)

#ifdef mingw32_HOST_OS
#ifdef x86_64_HOST_ARCH
#define CALLCONV ccall
#else
#define CALLCONV stdcall
#endif

foreign import CALLCONV unsafe "Windows.h CreateHardLinkW"
    c_CreateHardLinkW :: CWString -> CWString -> Ptr () -> IO Bool

createLinkMaybe from to = withCWString from $ \cfrom -> withCWString to $ \cto -> do
    res <- c_CreateHardLinkW cto cfrom nullPtr
    pure $ if res then Nothing else Just "CreateHardLink failed."

#else

createLinkMaybe from to = handleIO (pure . Just . show) $ createLink from to >> pure Nothing

#endif

copyFileLink :: Bool -> FilePath -> FilePath -> IO ()
copyFileLink useSymlink from to = do
    createDirectoryRecursive $ takeDirectory to
    removeFile_ to
    if not useSymlink then copyFile from to else do
        b <- createLinkMaybe from to
        whenJust b $ \_ -> copyFile from to
    -- making files read only stops them from inadvertently mutating the cache
    forM_ [from, to] $ \x -> do
        perm <- getPermissions x
        setPermissions x perm{writable=False}

shake-0.19.8/src/Development/Shake/Internal/History/Types.hs

module Development.Shake.Internal.History.Types(
    BS_QTypeRep, BS_Key, BS_Store, BS_Identity
    ) where

import qualified Data.ByteString as BS

type BS_QTypeRep = BS.ByteString
type BS_Key = BS.ByteString
type BS_Store = BS.ByteString
type BS_Identity = BS.ByteString

shake-0.19.8/src/Development/Shake/Internal/Options.hs

{-# LANGUAGE DeriveDataTypeable, PatternGuards #-}

-- | Types exposed to the user
module Development.Shake.Internal.Options(
    Progress(..), Verbosity(..), Rebuild(..), Lint(..), Change(..),
    ShakeOptions(..), shakeOptions,
    -- Internal stuff
    shakeRebuildApply, shakeAbbreviationsApply, shakeOptionsFields
    ) where

import Data.Data
import Data.List.Extra
import Data.Tuple.Extra
import Data.Maybe
import Data.Dynamic
import Control.Monad
import General.Extra
import System.Time.Extra
import qualified Data.HashMap.Strict as Map
import
Development.Shake.Internal.FilePattern import Development.Shake.Internal.Errors import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.UTF8 as UTF8 import Development.Shake.Internal.CmdOption import Data.Semigroup import Prelude -- | The current assumptions made by the build system, used by 'shakeRebuild'. These options -- allow the end user to specify that any rules run are either to be treated as clean, or as -- dirty, regardless of what the build system thinks. -- -- These assumptions only operate on files reached by the current 'Development.Shake.action' commands. Any -- other files in the database are left unchanged. data Rebuild = RebuildNow -- ^ Assume these files are dirty and require rebuilding. -- for benchmarking rebuild speed and for rebuilding if untracked dependencies have changed. -- This flag is safe, but may cause more rebuilding than necessary. | RebuildNormal -- ^ Useful to reset the rebuild status to how it was before, equivalent to passing no 'Rebuild' flags. | RebuildLater -- ^ /This assumption is unsafe, and may lead to incorrect build results in this run/. -- Assume these files are clean in this run, but test them normally in future runs. {- | RebuildNever -- Add to RebuildNow: Useful to undo the results of 'RebuildNever', -- ^ /This assumption is unsafe, and may lead to incorrect build results in this run, and in future runs/. -- Assume and record that these files are clean and do not require rebuilding, provided the file -- has been built before. Useful if you have modified a file in some -- inconsequential way, such as only the comments or whitespace, and wish to avoid a rebuild. -} deriving (Eq,Ord,Show,Read,Typeable,Data,Enum,Bounded) -- | Which lint checks to perform, used by 'shakeLint'. data Lint = LintBasic -- ^ The most basic form of linting. Checks that the current directory does not change and that results do not change after they -- are first written. 
Any calls to 'needed' will assert that they do not cause a rule to be rebuilt. | LintFSATrace -- ^ Track which files are accessed by command line programs -- using . deriving (Eq,Ord,Show,Read,Typeable,Data,Enum,Bounded) -- | How should you determine if a file has changed, used by 'shakeChange'. The most common values are -- 'ChangeModtime' (the default, very fast, @touch@ causes files to rebuild) and 'ChangeModtimeAndDigestInput' -- (slightly slower, @touch@ and switching @git@ branches does not cause input files to rebuild). data Change = ChangeModtime -- ^ Compare equality of modification timestamps, a file has changed if its last modified time changes. -- A @touch@ will force a rebuild. This mode is fast and usually sufficiently accurate, so is the default. | ChangeDigest -- ^ Compare equality of file contents digests, a file has changed if its digest changes. -- A @touch@ will not force a rebuild. Use this mode if modification times on your file system are unreliable. | ChangeModtimeAndDigest -- ^ A file is rebuilt if both its modification time and digest have changed. For efficiency reasons, the modification -- time is checked first, and if that has changed, the digest is checked. | ChangeModtimeAndDigestInput -- ^ Use 'ChangeModtimeAndDigest' for input\/source files and 'ChangeModtime' for output files. -- An input file is one which is a dependency but is not built by Shake as it has no -- matching rule and already exists on the file system. | ChangeModtimeOrDigest -- ^ A file is rebuilt if either its modification time or its digest has changed. A @touch@ will force a rebuild, -- but even if a files modification time is reset afterwards, changes will also cause a rebuild. deriving (Eq,Ord,Show,Read,Typeable,Data,Enum,Bounded) -- | Information about the current state of the build, obtained by either passing a callback function -- to 'Development.Shake.shakeProgress' (asynchronous output) or 'Development.Shake.getProgress' -- (synchronous output). 
Typically a build system will pass 'progressDisplay' to 'Development.Shake.shakeProgress', -- which will poll this value and produce status messages. data Progress = Progress -- In retrospect shakeProgress should have been done differently, as a feature you turn on in Rules -- but easiest way around that for now is put the Progress type in Options {isFailure :: !(Maybe String) -- ^ Starts out 'Nothing', becomes 'Just' a target name if a rule fails. ,countSkipped :: {-# UNPACK #-} !Int -- ^ Number of rules which were required, but were already in a valid state. ,countBuilt :: {-# UNPACK #-} !Int -- ^ Number of rules which were have been built in this run. ,countUnknown :: {-# UNPACK #-} !Int -- ^ Number of rules which have been built previously, but are not yet known to be required. ,countTodo :: {-# UNPACK #-} !Int -- ^ Number of rules which are currently required (ignoring dependencies that do not change), but not built. ,timeSkipped :: {-# UNPACK #-} !Double -- ^ Time spent building 'countSkipped' rules in previous runs. ,timeBuilt :: {-# UNPACK #-} !Double -- ^ Time spent building 'countBuilt' rules. ,timeUnknown :: {-# UNPACK #-} !Double -- ^ Time spent building 'countUnknown' rules in previous runs. ,timeTodo :: {-# UNPACK #-} !(Double,Int) -- ^ Time spent building 'countTodo' rules in previous runs, plus the number which have no known time (have never been built before). 
} deriving (Eq,Ord,Show,Read,Data,Typeable) instance Semigroup Progress where a <> b = Progress {isFailure = isFailure a `mplus` isFailure b ,countSkipped = countSkipped a + countSkipped b ,countBuilt = countBuilt a + countBuilt b ,countUnknown = countUnknown a + countUnknown b ,countTodo = countTodo a + countTodo b ,timeSkipped = timeSkipped a + timeSkipped b ,timeBuilt = timeBuilt a + timeBuilt b ,timeUnknown = timeUnknown a + timeUnknown b ,timeTodo = let (a1,a2) = timeTodo a; (b1,b2) = timeTodo b x1 = a1 + b1; x2 = a2 + b2 in x1 `seq` x2 `seq` (x1,x2) } instance Monoid Progress where mempty = Progress Nothing 0 0 0 0 0 0 0 (0,0) mappend = (<>) -- | Options to control the execution of Shake, usually specified by overriding fields in -- 'shakeOptions': -- -- @ 'shakeOptions'{'shakeThreads'=4, 'shakeReport'=[\"report.html\"]} @ -- -- The 'Data' instance for this type reports the 'shakeProgress' and 'shakeOutput' fields as having the abstract type 'Hidden', -- because 'Data' cannot be defined for functions or 'TypeRep's. data ShakeOptions = ShakeOptions {shakeFiles :: FilePath -- ^ Defaults to @.shake@. The directory used for storing Shake metadata files. -- All metadata files will be named @'shakeFiles'\/.shake./file-name/@, for some @/file-name/@. -- If the 'shakeFiles' directory does not exist it will be created. -- If set to @\"\/dev\/null\"@ then no shakeFiles are read or written (even on Windows). ,shakeThreads :: Int -- ^ Defaults to @1@. Maximum number of rules to run in parallel, similar to @make --jobs=/N/@. -- For many build systems, a number equal to or slightly less than the number of physical processors -- works well. Use @0@ to match the detected number of processors (when @0@, 'getShakeOptions' will -- return the number of threads used). ,shakeVersion :: String -- ^ Defaults to @"1"@. The version number of your build rules. 
-- Change the version number to force a complete rebuild, such as when making -- significant changes to the rules that require a wipe. The version number should be -- set in the source code, and not passed on the command line. ,shakeVerbosity :: Verbosity -- ^ Defaults to 'Info'. What level of messages should be printed out. ,shakeStaunch :: Bool -- ^ Defaults to 'False'. Operate in staunch mode, where building continues even after errors, -- similar to @make --keep-going@. ,shakeReport :: [FilePath] -- ^ Defaults to @[]@. Write a profiling report to a file, showing which rules rebuilt, -- why, and how much time they took. Useful for improving the speed of your build systems. -- If the file extension is @.json@ it will write JSON data; if @.js@ it will write Javascript; -- if @.trace@ it will write trace events (load into @about:\/\/tracing@ in Chrome); -- otherwise it will write HTML. ,shakeLint :: Maybe Lint -- ^ Defaults to 'Nothing'. Perform sanity checks during building, see 'Lint' for details. ,shakeLintInside :: [FilePath] -- ^ Directories in which the files will be tracked by the linter. ,shakeLintIgnore :: [FilePattern] -- ^ File patterns which are ignored from linter tracking, a bit like calling 'Development.Shake.trackAllow' in every rule. ,shakeLintWatch :: [FilePattern] -- ^ File patterns whose modification causes an error. Raises an error even if 'shakeLint' is 'Nothing'. ,shakeCommandOptions :: [CmdOption] -- ^ Defaults to @[]@. Additional options to be passed to all command invocations. ,shakeFlush :: Maybe Seconds -- ^ Defaults to @'Just' 10@. How often to flush Shake metadata files in seconds, or 'Nothing' to never flush explicitly. -- It is possible that on abnormal termination (not Haskell exceptions) any rules that completed in the last -- 'shakeFlush' seconds will be lost. ,shakeRebuild :: [(Rebuild, FilePattern)] -- ^ What to rebuild ,shakeAbbreviations :: [(String,String)] -- ^ Defaults to @[]@. 
A list of substrings that should be abbreviated in status messages, and their corresponding abbreviation. -- Commonly used to replace the long paths (e.g. @.make\/i586-linux-gcc\/output@) with an abbreviation (e.g. @$OUT@). ,shakeStorageLog :: Bool -- ^ Defaults to 'False'. Write a message to @'shakeFiles'\/.shake.storage.log@ whenever a storage event happens which may impact -- on the current stored progress. Examples include database version number changes, database compaction or corrupt files. ,shakeLineBuffering :: Bool -- ^ Defaults to 'True'. Change 'stdout' and 'stderr' to line buffering while running Shake. ,shakeTimings :: Bool -- ^ Defaults to 'False'. Print timing information for each stage at the end. ,shakeRunCommands :: Bool -- ^ Default to 'True'. Should you run command line actions, set to 'False' to skip actions whose output streams and exit code -- are not used. Useful for profiling the non-command portion of the build system. ,shakeChange :: Change -- ^ Default to 'ChangeModtime'. How to check if a file has changed, see 'Change' for details. ,shakeCreationCheck :: Bool -- ^ Default to 'True'. After running a rule to create a file, is it an error if the file does not exist. -- Provided for compatibility with @make@ and @ninja@ (which have ugly file creation semantics). ,shakeLiveFiles :: [FilePath] -- ^ Default to @[]@. After the build system completes, write a list of all files which were /live/ in that run, -- i.e. those which Shake checked were valid or rebuilt. Produces best answers if nothing rebuilds. ,shakeVersionIgnore :: Bool -- ^ Defaults to 'False'. Ignore any differences in 'shakeVersion'. ,shakeColor :: Bool -- ^ Defaults to 'False'. Whether to colorize the output. ,shakeShare :: Maybe FilePath -- ^ Defaults to 'Nothing'. Whether to use and store outputs in a shared directory. ,shakeCloud :: [String] -- ^ Defaults to @[]@. Cloud servers to talk to forming a shared cache. ,shakeSymlink :: Bool -- ^ Defaults to @False@. 
Use symlinks for 'shakeShare' if they are available. -- If this setting is @True@ (even if symlinks are not available) then files will be -- made read-only to avoid inadvertently poisoning the shared cache. -- Note the links are actually hard links, not symlinks. ,shakeNeedDirectory :: Bool -- ^ Defaults to @False@. Is depending on a directory an error (default), or is it permitted with -- undefined results? Provided for compatibility with @ninja@. ,shakeAllowRedefineRules :: Bool -- ^ Whether to allow calling 'addBuiltinRule' for the same key more than once. ,shakeProgress :: IO Progress -> IO () -- ^ Defaults to no action. A function called when the build starts, allowing progress to be reported. -- The function is called on a separate thread, and that thread is killed when the build completes. -- For applications that want to display progress messages, 'progressSimple' is often sufficient, but more advanced -- users should look at the 'Progress' data type. ,shakeOutput :: Verbosity -> String -> IO () -- ^ Defaults to writing using 'putStrLn'. A function called to output messages from Shake, along with the 'Verbosity' at -- which that message should be printed. This function will be called atomically from all other 'shakeOutput' functions. -- The 'Verbosity' will always be greater than or equal to 'shakeVerbosity'. ,shakeTrace :: String -> String -> Bool -> IO () -- ^ Defaults to doing nothing. -- Called for each call of 'Development.Shake.traced', with the key, the command and 'True' for starting, 'False' for stopping. ,shakeExtra :: Map.HashMap TypeRep Dynamic -- ^ This is a map which can be used to store arbitrary extra information that a user may need when writing rules. -- The key of each entry must be the 'dynTypeRep' of the value. -- Insert values using 'addShakeExtra' and retrieve them using 'getShakeExtra'. -- The correct way to use this field is to define a hidden newtype for the key, so that conflicts cannot occur.
} deriving Typeable -- | The default set of 'ShakeOptions'. shakeOptions :: ShakeOptions shakeOptions = ShakeOptions ".shake" 1 "1" Info False [] Nothing [] [] [] [] (Just 10) [] [] False True False True ChangeModtime True [] False False Nothing [] False False False (const $ pure ()) (const $ BS.putStrLn . UTF8.fromString) -- try and output atomically using BS (\_ _ _ -> pure ()) Map.empty fieldsShakeOptions = ["shakeFiles", "shakeThreads", "shakeVersion", "shakeVerbosity", "shakeStaunch", "shakeReport" ,"shakeLint", "shakeLintInside", "shakeLintIgnore", "shakeLintWatch", "shakeCommandOptions" ,"shakeFlush", "shakeRebuild", "shakeAbbreviations", "shakeStorageLog" ,"shakeLineBuffering", "shakeTimings", "shakeRunCommands", "shakeChange", "shakeCreationCheck" ,"shakeLiveFiles", "shakeVersionIgnore", "shakeColor", "shakeShare", "shakeCloud", "shakeSymlink" ,"shakeNeedDirectory", "shakeAllowRedefineRules" ,"shakeProgress", "shakeOutput", "shakeTrace", "shakeExtra"] tyShakeOptions = mkDataType "Development.Shake.Types.ShakeOptions" [conShakeOptions] conShakeOptions = mkConstr tyShakeOptions "ShakeOptions" fieldsShakeOptions Prefix unhide x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 x12 x13 x14 x15 x16 x17 x18 x19 x20 x21 x22 x23 x24 x25 x26 x27 x28 y1 y2 y3 y4 = ShakeOptions x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 x12 x13 x14 x15 x16 x17 x18 x19 x20 x21 x22 x23 x24 x25 x26 x27 x28 (fromHidden y1) (fromHidden y2) (fromHidden y3) (fromHidden y4) instance Data ShakeOptions where gfoldl k z (ShakeOptions x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 x12 x13 x14 x15 x16 x17 x18 x19 x20 x21 x22 x23 x24 x25 x26 x27 x28 y1 y2 y3 y4) = z unhide `k` x1 `k` x2 `k` x3 `k` x4 `k` x5 `k` x6 `k` x7 `k` x8 `k` x9 `k` x10 `k` x11 `k` x12 `k` x13 `k` x14 `k` x15 `k` x16 `k` x17 `k` x18 `k` x19 `k` x20 `k` x21 `k` x22 `k` x23 `k` x24 `k` x25 `k` x26 `k` x27 `k` x28 `k` Hidden y1 `k` Hidden y2 `k` Hidden y3 `k` Hidden y4 gunfold k z _ = k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k
$ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ z unhide toConstr ShakeOptions{} = conShakeOptions dataTypeOf _ = tyShakeOptions shakeOptionsFields :: ShakeOptions -> [(String, String)] shakeOptionsFields = zipExact fieldsShakeOptions . gmapQ f where f x | Just x <- cast x = show (x :: Int) | Just x <- cast x = show (x :: FilePath) | Just x <- cast x = show (x :: Verbosity) | Just x <- cast x = show (x :: Change) | Just x <- cast x = show (x :: Bool) | Just x <- cast x = show (x :: [FilePath]) | Just x <- cast x = show (x :: [(Rebuild, FilePattern)]) | Just x <- cast x = show (x :: Maybe Lint) | Just x <- cast x = show (x :: Maybe Double) | Just x <- cast x = show (x :: Maybe String) | Just x <- cast x = show (x :: [(String,String)]) | Just x <- cast x = show (x :: Hidden (IO Progress -> IO ())) | Just x <- cast x = show (x :: Hidden (Verbosity -> String -> IO ())) | Just x <- cast x = show (x :: Hidden (Map.HashMap TypeRep Dynamic)) | Just x <- cast x = show (x :: Hidden (String -> String -> Bool -> IO ())) | Just x <- cast x = show (x :: [CmdOption]) | otherwise = throwImpure $ errorInternal $ "Error while showing ShakeOptions, missing alternative for " ++ show (typeOf x) instance Show ShakeOptions where show x = "ShakeOptions {" ++ intercalate ", " (map (\(a,b) -> a ++ " = " ++ b) $ shakeOptionsFields x) ++ "}" -- | Internal type, copied from Hide in Uniplate newtype Hidden a = Hidden {fromHidden :: a} deriving Typeable instance Show (Hidden a) where show _ = "" instance Typeable a => Data (Hidden a) where gfoldl _ z = z gunfold _ _ _ = error "Development.Shake.Types.ShakeProgress: gunfold not implemented - data type has no constructors" toConstr _ = error "Development.Shake.Types.ShakeProgress: toConstr not implemented - data type has no constructors" dataTypeOf _ = tyHidden tyHidden = mkDataType "Development.Shake.Types.Hidden" [] -- | The verbosity data type, used by 'shakeVerbosity'. data Verbosity = Silent -- ^ Don't print any messages. 
| Error -- ^ Only print error messages. | Warn -- ^ Print errors and warnings. | Info -- ^ Print errors, warnings and @# /command-name/ (for /file-name/)@ when running a 'Development.Shake.traced' command. | Verbose -- ^ Print errors, warnings, full command lines when running a 'Development.Shake.command' or -- 'Development.Shake.cmd' command and status messages when starting a rule. | Diagnostic -- ^ Print messages for virtually everything (mostly for debugging). deriving (Eq,Ord,Show,Read,Typeable,Data,Enum,Bounded) -- | Apply the 'shakeRebuild' flags to a file, determining the desired behaviour shakeRebuildApply :: ShakeOptions -> (FilePath -> Rebuild) shakeRebuildApply ShakeOptions{shakeRebuild=rs} | null rs = const RebuildNormal | otherwise = \x -> fromMaybe RebuildNormal $ firstJust (\(r,pat) -> if pat x then Just r else Nothing) rs2 where rs2 = map (second (?==)) $ reverse rs shakeAbbreviationsApply :: ShakeOptions -> String -> String shakeAbbreviationsApply ShakeOptions{shakeAbbreviations=abbrev} | null abbrev = id | otherwise = f where -- order so longer abbreviations are preferred ordAbbrev = sortOn (negate . length . 
fst) abbrev f [] = [] f x | (to,rest):_ <- [(to,rest) | (from,to) <- ordAbbrev, Just rest <- [stripPrefix from x]] = to ++ f rest f (x:xs) = x : f xs shake-0.19.8/src/Development/Shake/Internal/Paths.hs0000644000000000000000000000620507346545000020373 0ustar0000000000000000{-# LANGUAGE CPP #-} #ifdef FILE_EMBED {-# LANGUAGE TemplateHaskell #-} #endif -- | The information from Paths_shake cleaned up module Development.Shake.Internal.Paths( shakeVersionString, initDataDirectory, hasManualData, copyManualData, readDataFileHTML ) where import Control.Monad.Extra import Data.Version import System.FilePath import General.Extra import qualified Data.ByteString.Lazy as LBS import Paths_shake #ifdef FILE_EMBED import qualified Data.ByteString as BS import Data.FileEmbed #else import Control.Exception import System.Directory import System.Info.Extra import System.IO.Unsafe import System.Environment #endif shakeVersionString :: String shakeVersionString = showVersion version #ifdef FILE_EMBED initDataDirectory :: IO () initDataDirectory = pure () htmlDataFiles :: [(FilePath, BS.ByteString)] htmlDataFiles = [ ("profile.html", $(embedFile "html/profile.html")) , ("progress.html", $(embedFile "html/progress.html")) , ("shake.js", $(embedFile "html/shake.js")) ] readDataFileHTML :: FilePath -> IO LBS.ByteString readDataFileHTML file = do case lookup file htmlDataFiles of Nothing -> fail $ "Could not find data file " ++ file ++ " in embedded data files!" Just x -> pure (LBS.fromStrict x) manualDirData :: [(FilePath, BS.ByteString)] manualDirData = $(embedDir "docs/manual") hasManualData :: IO Bool hasManualData = pure True copyManualData :: FilePath -> IO () copyManualData dest = do createDirectoryRecursive dest forM_ manualDirData $ \(file, bs) -> do BS.writeFile (dest </> file) bs #else -- We want getDataFileName to be relative to the current directory on program startup, -- even if we issue a change directory command. Therefore, first call caches, future ones read.
{-# NOINLINE dataDirs #-} dataDirs :: [String] dataDirs = unsafePerformIO $ do datdir <- getDataDir exedir <- takeDirectory <$> getExecutablePath `catchIO` \_ -> pure "" curdir <- getCurrentDirectory pure $ [datdir] ++ [exedir | exedir /= ""] ++ [curdir] -- The data files may be located relative to the current directory, if so cache it in advance initDataDirectory :: IO () initDataDirectory = void $ evaluate dataDirs getDataFile :: FilePath -> IO FilePath getDataFile file = do let poss = map (</> file) dataDirs res <- filterM doesFileExist_ poss case res of [] -> fail $ unlines $ ("Could not find data file " ++ file ++ ", looked in:") : map ("  " ++) poss x:_ -> pure x hasDataFile :: FilePath -> IO Bool hasDataFile file = anyM (\dir -> doesFileExist_ $ dir </> file) dataDirs readDataFileHTML :: FilePath -> IO LBS.ByteString readDataFileHTML file = LBS.readFile =<< getDataFile ("html" </> file) manualFiles :: [FilePath] manualFiles = map ("docs/manual" </>) ["Shakefile.hs","main.c","constants.c","constants.h","build" <.> if isWindows then "bat" else "sh"] hasManualData :: IO Bool hasManualData = allM hasDataFile manualFiles copyManualData :: FilePath -> IO () copyManualData dest = do createDirectoryRecursive dest forM_ manualFiles $ \file -> do src <- getDataFile file copyFile src (dest </> takeFileName file) #endif shake-0.19.8/src/Development/Shake/Internal/Profile.hs0000644000000000000000000002122007346545000020710 0ustar0000000000000000{-# LANGUAGE PatternGuards, RecordWildCards #-} module Development.Shake.Internal.Profile(writeProfile) where import General.Template import Data.Tuple.Extra import Data.Function import Data.List.Extra import Data.Maybe import System.FilePath import System.IO.Extra import Numeric.Extra import General.Extra import Development.Shake.Internal.Errors import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Database import Development.Shake.Internal.Value import qualified Data.HashSet as Set import
Development.Shake.Internal.Paths import Development.Shake.Classes import System.Time.Extra import qualified Data.HashMap.Strict as Map import qualified Data.ByteString.Lazy.Char8 as LBS import qualified Data.ByteString.Char8 as BS import General.Intern(Id) -- | Given a map representing a dependency order (with a show for error messages), find an ordering for the items such -- that no item points to an item before itself. -- Raise an error if you end up with a cycle. dependencyOrder :: (Eq a, Hashable a) => (a -> String) -> Map.HashMap a [a] -> [a] -- Algorithm: -- Divide everyone up into those who have no dependencies [Id] -- And those who depend on a particular Id, Dep :-> Maybe [(Key,[Dep])] -- Where d :-> Just (k, ds), k depends on firstly d, then remaining on ds -- For each with no dependencies, add to list, then take its dep hole and -- promote them either to Nothing (if ds == []) or into a new slot. -- k :-> Nothing means the key has already been freed dependencyOrder shw status = f (map fst noDeps) $ Map.map Just $ Map.fromListWith (++) [(d, [(k,ds)]) | (k,d:ds) <- hasDeps] where (noDeps, hasDeps) = partition (null . snd) $ Map.toList status f [] mp | null bad = [] | otherwise = throwImpure $ errorInternal $ unlines $ "Internal invariant broken, database seems to be cyclic" : map ("  " ++) bad ++ ["... plus " ++ show (length badOverflow) ++ " more ..."
| not $ null badOverflow] where (bad,badOverflow) = splitAt 10 [shw i | (i, Just _) <- Map.toList mp] f (x:xs) mp = x : f (now++xs) later where Just free = Map.lookupDefault (Just []) x mp (now,later) = foldl' g ([], Map.insert x Nothing mp) free g (free, mp) (k, []) = (k:free, mp) g (free, mp) (k, d:ds) = case Map.lookupDefault (Just []) d mp of Nothing -> g (free, mp) (k, ds) Just todo -> (free, Map.insert d (Just $ (k,ds) : todo) mp) -- | Eliminate all errors from the database, pretending they don't exist resultsOnly :: Map.HashMap Id (Key, Status) -> Map.HashMap Id (Key, Result (Either BS.ByteString Value)) resultsOnly mp = Map.map (\(k, v) -> (k, let Just r = getResult v in r{depends = map (Depends . filter (isJust . flip Map.lookup keep) . fromDepends) $ depends r})) keep where keep = Map.filter (isJust . getResult . snd) mp removeStep :: Map.HashMap Id (Key, Result a) -> Map.HashMap Id (Key, Result a) removeStep = Map.filter (\(k,_) -> k /= stepKey) toReport :: Database -> IO [ProfileEntry] toReport db = do status <- removeStep . resultsOnly <$> getKeyValuesFromId db let order = let shw i = maybe "" (show . fst) $ Map.lookup i status in dependencyOrder shw $ Map.map (concatMap fromDepends . depends . snd) status ids = Map.fromList $ zip order [0..] steps = let xs = Set.toList $ Set.fromList $ concat [[changed, built] | (_,Result{..}) <- Map.elems status] in Map.fromList $ zip (sortBy (flip compare) xs) [0..] f (k, Result{..}) = ProfileEntry {prfName = show k ,prfBuilt = fromStep built ,prfChanged = fromStep changed ,prfDepends = filter (not . null) $ map (mapMaybe (`Map.lookup` ids) . 
fromDepends) depends ,prfExecution = floatToDouble execution ,prfTraces = map fromTrace $ sortOn traceStart traces } where fromStep i = fromJust $ Map.lookup i steps fromTrace (Trace a b c) = ProfileTrace (BS.unpack a) (floatToDouble b) (floatToDouble c) pure [maybe (throwImpure $ errorInternal "toReport") f $ Map.lookup i status | i <- order] data ProfileEntry = ProfileEntry {prfName :: String, prfBuilt :: Int, prfChanged :: Int, prfDepends :: [[Int]], prfExecution :: Double, prfTraces :: [ProfileTrace]} data ProfileTrace = ProfileTrace {prfCommand :: String, prfStart :: Double, prfStop :: Double} prfTime ProfileTrace{..} = prfStop - prfStart -- | Generates a report given some build system profiling data. writeProfile :: FilePath -> Database -> IO () writeProfile out db = writeProfileInternal out =<< toReport db writeProfileInternal :: FilePath -> [ProfileEntry] -> IO () writeProfileInternal out xs | takeExtension out == ".js" = writeFileBinary out $ "var profile = \n" ++ generateJSON xs | takeExtension out == ".json" = writeFileBinary out $ generateJSON xs | takeExtension out == ".trace" = writeFileBinary out $ generateTrace xs | out == "-" = putStr $ unlines $ generateSummary xs -- NOTE: On my laptop writing 1.5Mb of profile report takes 0.6s. -- This is fundamental to my laptop, not a Haskell profiling issue. -- Verified with similar "type foo > bar" commands taking similar time. | otherwise = LBS.writeFile out =<< generateHTML xs generateSummary :: [ProfileEntry] -> [String] generateSummary xs = ["* This database has tracked " ++ show (maximum (0 : map prfChanged xs) + 1) ++ " runs." ,let f = show . length in "* There are " ++ f xs ++ " rules (" ++ f ls ++ " rebuilt in the last run)." ,let f = show . sum . map (length . prfTraces) in "* Building required " ++ f xs ++ " traced commands (" ++ f ls ++ " in the last run)."
,"* The total (unparallelised) time is " ++ showDuration (sum $ map prfExecution xs) ++ " of which " ++ showDuration (sum $ map prfTime $ concatMap prfTraces xs) ++ " is traced commands." ,let f xs = if null xs then "0s" else (\(a,b) -> showDuration a ++ " (" ++ b ++ ")") $ maximumBy' (compare `on` fst) xs in "* The longest rule takes " ++ f (map (prfExecution &&& prfName) xs) ++ ", and the longest traced command takes " ++ f (map (prfTime &&& prfCommand) $ concatMap prfTraces xs) ++ "." ,let sumLast = sum $ map prfTime $ concatMap prfTraces ls maxStop = maximum $ 0 : map prfStop (concatMap prfTraces ls) in "* Last run gave an average parallelism of " ++ showDP 2 (if maxStop == 0 then 0 else sumLast / maxStop) ++ " times over " ++ showDuration maxStop ++ "." ] where ls = filter ((==) 0 . prfBuilt) xs generateHTML :: [ProfileEntry] -> IO LBS.ByteString generateHTML xs = do report <- readDataFileHTML "profile.html" let f "data/profile-data.js" = pure $ LBS.pack $ "var profile =\n" ++ generateJSON xs runTemplate f report generateTrace :: [ProfileEntry] -> String generateTrace xs = jsonListLines $ showEntries 0 [y{prfCommand=prfName x} | x <- onlyLast, y <- prfTraces x] ++ showEntries 1 (concatMap prfTraces onlyLast) where onlyLast = filter (\x -> prfBuilt x == 0) xs showEntries pid xs = map (showEntry pid) $ snd $ mapAccumL alloc [] $ sortOn prfStart xs alloc :: [ProfileTrace] -> ProfileTrace -> ([ProfileTrace], (Int, ProfileTrace)) -- FIXME: I don't really understand what this code is doing, or the invariants it ensures alloc as r | (a1,_:a2) <- break (\a -> prfStop a <= prfStart r) as = (a1++r:a2, (length a1,r)) | otherwise = (as++[r], (length as,r)) showEntry pid (tid, ProfileTrace{..}) = jsonObject [("args","{}"), ("ph",show "X"), ("cat",show "target") ,("name",show prfCommand), ("tid",show tid), ("pid",show pid) ,("ts",show $ 1000000*prfStart), ("dur",show $ 1000000*(prfStop-prfStart))] generateJSON :: [ProfileEntry] -> String generateJSON = jsonListLines . 
map showEntry where showEntry ProfileEntry{..} = jsonList $ [show prfName ,showTime prfExecution ,show prfBuilt ,show prfChanged] ++ [show prfDepends | not (null prfDepends) || not (null prfTraces)] ++ [jsonList $ map showTrace prfTraces | not (null prfTraces)] showTrace ProfileTrace{..} = jsonList [show prfCommand, showTime prfStart, showTime prfStop] showTime x = if '.' `elem` y then dropWhileEnd (== '.') $ dropWhileEnd (== '0') y else y where y = showDP 4 x jsonListLines xs = "[" ++ intercalate "\n," xs ++ "\n]" jsonList xs = "[" ++ intercalate "," xs ++ "]" jsonObject xs = "{" ++ intercalate "," [show a ++ ":" ++ b | (a,b) <- xs] ++ "}" shake-0.19.8/src/Development/Shake/Internal/Progress.hs0000644000000000000000000003437707346545000021133 0ustar0000000000000000{-# LANGUAGE RecordWildCards, CPP, ViewPatterns, ForeignFunctionInterface, TupleSections #-} -- | Progress tracking module Development.Shake.Internal.Progress( progress, progressSimple, progressDisplay, progressTitlebar, progressProgram, ProgressEntry(..), progressReplay, writeProgressReport -- INTERNAL USE ONLY ) where import Control.Applicative import Data.Tuple.Extra import Control.Exception.Extra import Control.Monad.Extra import System.Directory import System.Process import System.FilePath import Data.Char import Data.IORef import Data.List import Data.Maybe import Development.Shake.Internal.Options import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Database import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Lazy.Char8 as LBS import Numeric.Extra import General.Template import General.EscCodes import General.Extra import Development.Shake.Internal.Paths import System.Time.Extra #ifdef mingw32_HOST_OS import Foreign.C.String #ifdef x86_64_HOST_ARCH #define CALLCONV ccall #else #define CALLCONV stdcall #endif foreign import CALLCONV "Windows.h SetConsoleTitleW" c_setConsoleTitleW :: CWString -> IO Bool #endif 
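The profile writer above builds all of its output with tiny hand-rolled JSON combinators (jsonList, jsonObject, jsonListLines) rather than a JSON library. A minimal standalone sketch of how two of them compose (the helpers are restated from the module; the sample command name and timings are invented):

```haskell
import Data.List (intercalate)

-- Restated from the profile writer above: 'show' on the key supplies the
-- JSON string quotes; each value must already be rendered JSON text.
jsonList :: [String] -> String
jsonList xs = "[" ++ intercalate "," xs ++ "]"

jsonObject :: [(String, String)] -> String
jsonObject xs = "{" ++ intercalate "," [show a ++ ":" ++ b | (a, b) <- xs] ++ "}"

main :: IO ()
main = putStrLn $ jsonObject
    [ ("name", show "gcc -c main.c")       -- a rendered JSON string
    , ("times", jsonList ["0.1", "0.25"])  -- a rendered JSON array
    ]
-- prints {"name":"gcc -c main.c","times":[0.1,0.25]}
```

Keeping the combinators this small avoids a JSON dependency, at the cost of requiring every value to be pre-rendered, which is why the surrounding code formats numbers with showTime/showDP before passing them in.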
--------------------------------------------------------------------- -- PROGRESS progress :: Database -> Step -> IO Progress progress db step = do xs <- getKeyValues db pure $! foldl' f mempty $ map snd xs where g = floatToDouble f s (Ready Result{..}) = if step == built then s{countBuilt = countBuilt s + 1, timeBuilt = timeBuilt s + g execution} else s{countSkipped = countSkipped s + 1, timeSkipped = timeSkipped s + g execution} f s (Loaded Result{..}) = s{countUnknown = countUnknown s + 1, timeUnknown = timeUnknown s + g execution} f s (Running _ r) = let (d,c) = timeTodo s t | Just Result{..} <- r = let d2 = d + g execution in d2 `seq` (d2,c) | otherwise = let c2 = c + 1 in c2 `seq` (d,c2) in s{countTodo = countTodo s + 1, timeTodo = t} f s _ = s --------------------------------------------------------------------- -- MEALY TYPE - for writing the progress functions -- See -- | A machine that takes inputs and produces outputs newtype Mealy i a = Mealy {runMealy :: i -> (a, Mealy i a)} instance Functor (Mealy i) where fmap f (Mealy m) = Mealy $ \i -> case m i of (x, m) -> (f x, fmap f m) instance Applicative (Mealy i) where pure x = let r = Mealy (const (x, r)) in r Mealy mf <*> Mealy mx = Mealy $ \i -> case mf i of (f, mf) -> case mx i of (x, mx) -> (f x, mf <*> mx) echoMealy :: Mealy i i echoMealy = Mealy (,echoMealy) scanMealy :: (a -> b -> a) -> a -> Mealy i b -> Mealy i a scanMealy f z (Mealy m) = Mealy $ \i -> case m i of (x, m) -> let z2 = f z x in (z2, scanMealy f z2 m) --------------------------------------------------------------------- -- MEALY UTILITIES oldMealy :: a -> Mealy i a -> Mealy i (a,a) oldMealy old = scanMealy (\(_,old) new -> (old,new)) (old,old) latch :: Mealy i (Bool, a) -> Mealy i a latch s = fromJust <$> scanMealy f Nothing s where f old (b,v) = Just $ if b then fromMaybe v old else v iff :: Mealy i Bool -> Mealy i a -> Mealy i a -> Mealy i a iff c t f = (\c t f -> if c then t else f) <$> c <*> t <*> f -- decay'd division, compute a/b, 
with a decay of f -- r' is the new result, r is the last result -- r' ~= a' / b' -- r' = r*b + f*(a'-a) -- ------------- -- b + f*(b'-b) -- when f == 1, r == r' -- -- both streams must only ever increase decay :: Double -> Mealy i Double -> Mealy i Double -> Mealy i Double decay f a b = scanMealy step 0 $ (,) <$> oldMealy 0 a <*> oldMealy 0 b where step r ((a,a'),(b,b')) = if isNaN r then a' / b' else ((r*b) + f*(a'-a)) / (b + f*(b'-b)) --------------------------------------------------------------------- -- MESSAGE GENERATOR formatMessage :: Double -> Double -> String formatMessage secs perc = (if isNaN secs || secs < 0 then "??s" else showMinSec $ ceiling secs) ++ " (" ++ (if isNaN perc || perc < 0 || perc > 100 then "??" else show $ floor perc) ++ "%)" showMinSec :: Int -> String showMinSec secs = (if m == 0 then "" else show m ++ "m" ++ ['0' | s < 10]) ++ show s ++ "s" where (m,s) = divMod secs 60 liftA2' :: Applicative m => m a -> m b -> (a -> b -> c) -> m c liftA2' a b f = liftA2 f a b -- | return (number of seconds, percentage, explanation) message :: Mealy (Double, Progress) (Double, Progress) -> Mealy (Double, Progress) (Double, Double, String) message input = liftA3 (,,) time perc debug where progress = snd <$> input secs = fst <$> input debug = (\donePerSec ruleTime (todoKnown,todoUnknown) -> "Progress: " ++ "((known=" ++ showDP 2 todoKnown ++ "s) + " ++ "(unknown=" ++ show todoUnknown ++ " * time=" ++ showDP 2 ruleTime ++ "s)) " ++ "(rate=" ++ showDP 2 donePerSec ++ "))") <$> donePerSec <*> ruleTime <*> (timeTodo <$> progress) -- Number of seconds work completed in this build run -- Ignores timeSkipped which would be more truthful, but it makes the % drop sharply -- which isn't what users want done = timeBuilt <$> progress -- Work done per second, don't divide by 0 and don't update if 'done' doesn't change donePerSec = iff ((==) 0 <$> done) (pure 1) perSecStable where perSecStable = latch $ liftA2 (,) (uncurry (==) <$> oldMealy 0 done) perSecRaw 
perSecRaw = decay 1.2 done secs -- Predicted build time for a rule that has never been built before -- The high decay means if a build goes in "phases" - lots of source files, then lots of compiling -- we reach a reasonable number fairly quickly, without bouncing too much ruleTime = liftA2 weightedAverage (f (decay 10) timeBuilt countBuilt) (f (liftA2 (/)) (fst . timeTodo) (\Progress{..} -> countTodo - snd timeTodo)) -- don't call decay on todo, since it goes up and down (as things get done) where weightedAverage (w1,x1) (w2,x2) | w1 == 0 && w2 == 0 = 0 | otherwise = ((w1 *. x1) + (w2 *. x2)) / intToDouble (w1+w2) where i *. d = if i == 0 then 0 else intToDouble i * d -- since d might be NaN f divide time count = let xs = count <$> progress in liftA2 (,) xs $ divide (time <$> progress) (intToDouble <$> xs) -- Number of seconds work remaining, ignoring multiple threads todo = f <$> progress <*> ruleTime where f Progress{..} ruleTime = fst timeTodo + (fromIntegral (snd timeTodo) * ruleTime) -- Display information time = liftA2 (/) todo donePerSec perc = iff ((==) 0 <$> done) (pure 0) $ liftA2' done todo $ \done todo -> 100 * done / (done + todo) --------------------------------------------------------------------- -- EXPOSED FUNCTIONS -- | Given a sampling interval (in seconds) and a way to display the status message, -- produce a function suitable for using as 'Development.Shake.shakeProgress'. -- This function polls the progress information every /n/ seconds, produces a status -- message and displays it using the display function. -- -- Typical status messages will take the form of @1m25s (15%)@, indicating that the build -- is predicted to complete in 1 minute 25 seconds (85 seconds total), and 15% of the necessary build time has elapsed. -- This function uses past observations to predict future behaviour, and as such, is only -- guessing. 
The time is likely to go up as well as down, and will be less accurate from a -- clean build (as the system has fewer past observations). -- -- The current implementation is to predict the time remaining (based on 'timeTodo') and the -- work already done ('timeBuilt'). The percentage is then calculated as @remaining / (done + remaining)@, -- while time left is calculated by scaling @remaining@ by the observed work rate in this build, -- roughly @done / time_elapsed@. progressDisplay :: Double -> (String -> IO ()) -> IO Progress -> IO () progressDisplay sample disp prog = do disp "Starting..." -- no useful info at this stage time <- offsetTime catchJust (\x -> if x == ThreadKilled then Just () else Nothing) (loop time $ message echoMealy) (const $ do t <- time; disp $ "Finished in " ++ showDuration t) where loop :: IO Double -> Mealy (Double, Progress) (Double, Double, String) -> IO () loop time mealy = do sleep sample p <- prog t <- time ((secs,perc,_debug), mealy)<- pure $ runMealy mealy (t, p) -- putStrLn _debug let done = countSkipped p + countBuilt p let todo = done + countUnknown p + countTodo p disp $ "Running for " ++ showDurationSecs t ++ " [" ++ show done ++ "/" ++ show todo ++ "]" ++ ", predicted " ++ formatMessage secs perc ++ maybe "" (", Failure! 
" ++) (isFailure p) loop time mealy data ProgressEntry = ProgressEntry {idealSecs :: Double, idealPerc :: Double ,actualSecs :: Double, actualPerc :: Double } isInvalid :: ProgressEntry -> Bool isInvalid ProgressEntry{..} = isNaN actualSecs || isNaN actualPerc -- | Given a list of progress inputs, what would you have suggested (seconds, percentage) progressReplay :: [(Double, Progress)] -> [ProgressEntry] progressReplay [] = [] progressReplay ps = snd $ mapAccumL f (message echoMealy) ps where end = fst $ last ps f a (time,p) = (a2, ProgressEntry (end - time) (time * 100 / end) secs perc) where ((secs,perc,_),a2) = runMealy a (time,p) -- | Given a trace, display information about how well we did writeProgressReport :: FilePath -> [(FilePath, [(Double, Progress)])] -> IO () writeProgressReport out (map (second progressReplay) -> xs) | (bad,_):_ <- filter (any isInvalid . snd) xs = errorIO $ "Progress generates NaN for " ++ bad | takeExtension out == ".js" = writeFile out $ "var shake = \n" ++ generateJSON xs | takeExtension out == ".json" = writeFile out $ generateJSON xs | out == "-" = putStr $ unlines $ generateSummary xs | otherwise = LBS.writeFile out =<< generateHTML xs generateSummary :: [(FilePath, [ProgressEntry])] -> [String] generateSummary xs = flip concatMap xs $ \(file,xs) -> ["# " ++ file, f xs "Seconds" idealSecs actualSecs, f xs "Percent" idealPerc actualPerc] where levels = [100,90,80,50] f xs lbl ideal actual = lbl ++ ": " ++ intercalate ", " [show l ++ "% within " ++ show (ceiling $ maximum $ 0 : take ((length xs * l) `div` 100) diff) | l <- levels] where diff = sort [abs $ ideal x - actual x | x <- xs] generateHTML :: [(FilePath, [ProgressEntry])] -> IO LBS.ByteString generateHTML xs = do report <- readDataFileHTML "progress.html" let f "data/progress-data.js" = pure $ LBS.pack $ "var progress =\n" ++ generateJSON xs runTemplate f report generateJSON :: [(FilePath, [ProgressEntry])] -> String generateJSON = concat . jsonList . map ((++"}") . 
unlines . f) where f (file,ps) = ("{\"name\":" ++ show (takeFileName file) ++ ", \"values\":") : indent (jsonList $ map g ps) shw = showDP 1 g ProgressEntry{..} = jsonObject [("idealSecs",shw idealSecs),("idealPerc",shw idealPerc) ,("actualSecs",shw actualSecs),("actualPerc",shw actualPerc)] indent = map (" "++) jsonList xs = zipWith (:) ('[':repeat ',') xs ++ ["]"] jsonObject xs = "{" ++ intercalate ", " [show a ++ ":" ++ b | (a,b) <- xs] ++ "}" -- | Set the title of the current console window to the given text. If the -- environment variable @$TERM@ is set to @xterm@ this uses xterm escape sequences. -- On Windows, if not detected as an xterm, this function uses the @SetConsoleTitle@ API. progressTitlebar :: String -> IO () progressTitlebar x = unlessM win lin where #ifdef mingw32_HOST_OS win = withCWString x c_setConsoleTitleW #else win = pure False #endif lin = whenM checkEscCodes $ BS.putStr $ BS.pack $ escWindowTitle x -- | Call the program @shake-progress@ if it is on the @$PATH@. The program is called with -- the following arguments: -- -- * @--title=string@ - the string passed to @progressProgram@. -- -- * @--state=Normal@, or one of @NoProgress@, @Normal@, or @Error@ to indicate -- what state the progress bar should be in. -- -- * @--value=25@ - the percent of the build that has completed, if not in @NoProgress@ state. -- -- The program will not be called consecutively with the same @--state@ and @--value@ options. -- -- Windows 7 or higher users can get taskbar progress notifications by placing the following -- program in their @$PATH@: . progressProgram :: IO (String -> IO ()) progressProgram = do exe <- findExecutable "shake-progress" case exe of Nothing -> pure $ const $ pure () Just exe -> do lastArgs <- newIORef Nothing -- the arguments we passed to shake-progress last time pure $ \msg -> do let failure = " Failure! 
" `isInfixOf` msg let perc = let (a,b) = break (== '%') msg in if null b then "" else reverse $ takeWhile isDigit $ reverse a let state | perc == "" = "NoProgress" | failure = "Error" | otherwise = "Normal" let args = ["--title=" ++ msg, "--state=" ++ state] ++ ["--value=" ++ perc | perc /= ""] same <- atomicModifyIORef lastArgs $ \old -> (Just args, old == Just args) unless same $ void $ rawSystem exe args -- | A simple method for displaying progress messages, suitable for using as 'Development.Shake.shakeProgress'. -- This function writes the current progress to the titlebar every five seconds using 'progressTitlebar', -- and calls any @shake-progress@ program on the @$PATH@ using 'progressProgram'. progressSimple :: IO Progress -> IO () progressSimple p = do program <- progressProgram progressDisplay 5 (\s -> progressTitlebar s >> program s) p shake-0.19.8/src/Development/Shake/Internal/Resource.hs0000644000000000000000000002003707346545000021102 0ustar0000000000000000{-# LANGUAGE LambdaCase #-} {-# LANGUAGE RecordWildCards, ViewPatterns #-} module Development.Shake.Internal.Resource( Resource, newResourceIO, newThrottleIO, withResource ) where import Data.Function import System.IO.Unsafe import Control.Concurrent.Extra import General.Fence import Control.Exception.Extra import Data.Tuple.Extra import Data.IORef import Control.Monad.Extra import General.Bilist import General.Pool import Development.Shake.Internal.Core.Action import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Monad import Development.Shake.Internal.Core.Pool import Control.Monad.IO.Class import System.Time.Extra {-# NOINLINE resourceId #-} resourceId :: IO Int resourceId = unsafePerformIO resourceCounter -- Work around for GHC bug https://gitlab.haskell.org/ghc/ghc/-/issues/19413 {-# NOINLINE resourceCounter #-} resourceCounter :: IO (IO Int) resourceCounter = do ref <- newIORef 0 pure $ atomicModifyIORef' ref $ \i -> let j = i + 1 in (j, j) -- | Run an action 
which uses part of a finite resource. For more details see 'Resource'. -- You cannot depend on a rule (e.g. 'need') while a resource is held. withResource :: Resource -> Int -> Action a -> Action a withResource r i act = do Global{..} <- Action getRO liftIO $ globalDiagnostic $ pure $ show r ++ " waiting to acquire " ++ show i fence <- liftIO $ acquireResource r globalPool i whenJust fence $ \fence -> do (offset, ()) <- actionFenceRequeueBy Right fence Action $ modifyRW $ addDiscount offset liftIO $ globalDiagnostic $ pure $ show r ++ " running with " ++ show i Action $ fromAction (blockApply ("Within withResource using " ++ show r) act) `finallyRAW` do liftIO $ releaseResource r globalPool i liftIO $ globalDiagnostic $ pure $ show r ++ " released " ++ show i -- | A type representing an external resource which the build system should respect. There -- are two ways to create 'Resource's in Shake: -- -- * 'Development.Shake.newResource' creates a finite resource, stopping too many actions running -- simultaneously. -- -- * 'Development.Shake.newThrottle' creates a throttled resource, stopping too many actions running -- over a short time period. -- -- These resources are used with 'Development.Shake.withResource' when defining rules. Typically only -- system commands (such as 'Development.Shake.cmd') should be run inside 'Development.Shake.withResource', -- not commands such as 'Development.Shake.need'. -- -- Be careful that the actions run within 'Development.Shake.withResource' do not themselves require further -- resources, or you may get a \"thread blocked indefinitely in an MVar operation\" exception. -- If an action requires multiple resources, use 'Development.Shake.withResources' to avoid deadlock. data Resource = Resource {resourceOrd :: Int -- ^ Key used for Eq/Ord operations. 
To make withResources work, we require newResourceIO < newThrottleIO ,resourceShow :: String -- ^ String used for Show ,acquireResource :: Pool -> Int -> IO (Maybe (Fence IO ())) -- ^ Acquire the resource and call the function. ,releaseResource :: Pool -> Int -> IO () -- ^ You should only ever release a resource that you previously obtained with acquireResource. } instance Show Resource where show = resourceShow instance Eq Resource where (==) = (==) `on` resourceOrd instance Ord Resource where compare = compare `on` resourceOrd --------------------------------------------------------------------- -- FINITE RESOURCES data Finite = Finite {finiteAvailable :: !Int -- ^ number of currently available resources ,finiteWaiting :: Bilist (Int, Fence IO ()) -- ^ queue of people with how much they want and the action when it is allocated to them } -- | A version of 'Development.Shake.newResource' that runs in IO, and can be called before calling 'Development.Shake.shake'. -- Most people should use 'Development.Shake.newResource' instead.
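As a usage sketch for the finite-resource machinery above (the rule pattern, file names, and the @gcc@ command line are illustrative assumptions, not taken from this module):

```haskell
import Development.Shake
import Development.Shake.FilePath

-- Allow at most two concurrent linker invocations across all rules.
rules :: Rules ()
rules = do
    linker <- newResource "linker" 2
    "_build/*.exe" %> \out -> do
        need [out -<.> "o"]
        -- Hold one unit of the resource for the duration of the command.
        withResource linker 1 $
            cmd "gcc -o" [out, out -<.> "o"]
```

Because the 'need' happens outside 'withResource', dependencies can still build in parallel; only the linker invocation itself is serialised.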
newResourceIO :: String -> Int -> IO Resource newResourceIO name mx = do when (mx < 0) $ errorIO $ "You cannot create a resource named " ++ name ++ " with a negative quantity, you used " ++ show mx key <- resourceId var <- newVar $ Finite mx mempty pure $ Resource (negate key) shw (acquire var) (release var) where shw = "Resource " ++ name acquire :: Var Finite -> Pool -> Int -> IO (Maybe (Fence IO ())) acquire var _ want | want < 0 = errorIO $ "You cannot acquire a negative quantity of " ++ shw ++ ", requested " ++ show want | want > mx = errorIO $ "You cannot acquire more than " ++ show mx ++ " of " ++ shw ++ ", requested " ++ show want | otherwise = modifyVar var $ \x@Finite{..} -> if want <= finiteAvailable then pure (x{finiteAvailable = finiteAvailable - want}, Nothing) else do fence <- newFence pure (x{finiteWaiting = finiteWaiting `snoc` (want, fence)}, Just fence) release :: Var Finite -> Pool -> Int -> IO () release var _ i = join $ modifyVar var $ \x -> pure $ f x{finiteAvailable = finiteAvailable x + i} where f (Finite i (uncons -> Just ((wi,wa),ws))) | wi <= i = second (signalFence wa () >>) $ f $ Finite (i-wi) ws | otherwise = first (add (wi,wa)) $ f $ Finite i ws f (Finite i _) = (Finite i mempty, pure ()) add a s = s{finiteWaiting = a `cons` finiteWaiting s} --------------------------------------------------------------------- -- THROTTLE RESOURCES -- call a function after a certain delay waiter :: Seconds -> IO () -> IO () waiter period act = void $ forkIO $ do sleep period act data Throttle -- | Some number of resources are available = ThrottleAvailable !Int -- | Some users are blocked (non-empty), plus an action to call once we go back to Available | ThrottleWaiting (IO ()) (Bilist (Int, Fence IO ())) -- | A version of 'Development.Shake.newThrottle' that runs in IO, and can be called before calling 'Development.Shake.shake'. -- Most people should use 'Development.Shake.newThrottle' instead. 
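A throttle sketch for the machinery above, limiting the rate rather than the concurrency of actions (the resource name, file patterns, URL, and the @curl@ command line are illustrative assumptions):

```haskell
import Development.Shake
import Development.Shake.FilePath

-- Permit at most 5 downloads in any 10-second window.
rules :: Rules ()
rules = do
    api <- newThrottle "downloads" 5 10
    "downloads/*.json" %> \out ->
        withResource api 1 $
            cmd "curl -sSf -o" [out]
                ["https://example.invalid/" ++ takeFileName out]
```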
newThrottleIO :: String -> Int -> Double -> IO Resource newThrottleIO name count period = do when (count < 0) $ errorIO $ "You cannot create a throttle named " ++ name ++ " with a negative quantity, you used " ++ show count key <- resourceId var <- newVar $ ThrottleAvailable count pure $ Resource key shw (acquire var) (release var) where shw = "Throttle " ++ name acquire :: Var Throttle -> Pool -> Int -> IO (Maybe (Fence IO ())) acquire var pool want | want < 0 = errorIO $ "You cannot acquire a negative quantity of " ++ shw ++ ", requested " ++ show want | want > count = errorIO $ "You cannot acquire more than " ++ show count ++ " of " ++ shw ++ ", requested " ++ show want | otherwise = modifyVar var $ \case ThrottleAvailable i | i >= want -> pure (ThrottleAvailable $ i - want, Nothing) | otherwise -> do stop <- keepAlivePool pool fence <- newFence pure (ThrottleWaiting stop $ (want - i, fence) `cons` mempty, Just fence) ThrottleWaiting stop xs -> do fence <- newFence pure (ThrottleWaiting stop $ xs `snoc` (want, fence), Just fence) release :: Var Throttle -> Pool -> Int -> IO () release var _ n = waiter period $ join $ modifyVar var $ \x -> pure $ case x of ThrottleAvailable i -> (ThrottleAvailable $ i+n, pure ()) ThrottleWaiting stop xs -> f stop n xs where f stop i (uncons -> Just ((wi,wa),ws)) | i >= wi = second (signalFence wa () >>) $ f stop (i-wi) ws | otherwise = (ThrottleWaiting stop $ (wi-i,wa) `cons` ws, pure ()) f stop i _ = (ThrottleAvailable i, stop) shake-0.19.8/src/Development/Shake/Internal/Rules/0000755000000000000000000000000007346545000020047 5ustar0000000000000000shake-0.19.8/src/Development/Shake/Internal/Rules/Default.hs0000644000000000000000000000073507346545000021774 0ustar0000000000000000 module Development.Shake.Internal.Rules.Default( defaultRules ) where import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Rules.Directory import Development.Shake.Internal.Rules.File import Development.Shake.Internal.Rules.Files 
import Development.Shake.Internal.Rules.Rerun -- All the rules baked into Shake defaultRules :: Rules () defaultRules = do defaultRuleFile defaultRuleFiles defaultRuleDirectory defaultRuleRerun shake-0.19.8/src/Development/Shake/Internal/Rules/Directory.hs0000644000000000000000000003752707346545000022365 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, ScopedTypeVariables, DeriveDataTypeable #-} {-# LANGUAGE TypeFamilies, TypeOperators, ConstraintKinds #-} -- | Both System.Directory and System.Environment wrappers module Development.Shake.Internal.Rules.Directory( doesFileExist, doesDirectoryExist, getDirectoryContents, getDirectoryFiles, getDirectoryDirs, getEnv, getEnvWithDefault, getEnvError, removeFiles, removeFilesAfter, getDirectoryFilesIO, defaultRuleDirectory ) where import Control.Exception.Extra import Control.Monad.Extra import Control.Monad.IO.Class import Data.Maybe import Data.Binary import Data.List import Data.Tuple.Extra import qualified Data.HashSet as Set import qualified System.Directory as IO import qualified System.Environment as IO import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Action import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Core.Build import Development.Shake.Internal.Value import Development.Shake.Classes import Development.Shake.FilePath import Development.Shake.Internal.FilePattern import General.Extra import General.Binary --------------------------------------------------------------------- -- KEY/VALUE TYPES type instance RuleResult DoesFileExistQ = DoesFileExistA newtype DoesFileExistQ = DoesFileExistQ FilePath deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show DoesFileExistQ where show (DoesFileExistQ a) = "doesFileExist " ++ wrapQuote a newtype DoesFileExistA = DoesFileExistA {fromDoesFileExistA :: Bool} deriving (Typeable,Eq,BinaryEx,NFData) instance Show DoesFileExistA where show (DoesFileExistA a) = show a type instance 
RuleResult DoesDirectoryExistQ = DoesDirectoryExistA newtype DoesDirectoryExistQ = DoesDirectoryExistQ FilePath deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show DoesDirectoryExistQ where show (DoesDirectoryExistQ a) = "doesDirectoryExist " ++ wrapQuote a newtype DoesDirectoryExistA = DoesDirectoryExistA {fromDoesDirectoryExistA :: Bool} deriving (Typeable,Eq,BinaryEx,NFData) instance Show DoesDirectoryExistA where show (DoesDirectoryExistA a) = show a type instance RuleResult GetEnvQ = GetEnvA newtype GetEnvQ = GetEnvQ String deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show GetEnvQ where show (GetEnvQ a) = "getEnv " ++ wrapQuote a newtype GetEnvA = GetEnvA {fromGetEnvA :: Maybe String} deriving (Typeable,Eq,Hashable,BinaryEx,NFData) instance Show GetEnvA where show (GetEnvA a) = maybe "" wrapQuote a type instance RuleResult GetDirectoryContentsQ = GetDirectoryA type instance RuleResult GetDirectoryFilesQ = GetDirectoryA type instance RuleResult GetDirectoryDirsQ = GetDirectoryA newtype GetDirectoryContentsQ = GetDirectoryContentsQ FilePath deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show GetDirectoryContentsQ where show (GetDirectoryContentsQ dir) = "getDirectoryContents " ++ wrapQuote dir newtype GetDirectoryFilesQ = GetDirectoryFilesQ (FilePath, [FilePattern]) deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show GetDirectoryFilesQ where show (GetDirectoryFilesQ (dir, pat)) = "getDirectoryFiles " ++ wrapQuote dir ++ " [" ++ unwords (map wrapQuote pat) ++ "]" newtype GetDirectoryDirsQ = GetDirectoryDirsQ FilePath deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show GetDirectoryDirsQ where show (GetDirectoryDirsQ dir) = "getDirectoryDirs " ++ wrapQuote dir newtype GetDirectoryA = GetDirectoryA {fromGetDirectoryA :: [FilePath]} deriving (Typeable,Eq,Hashable,BinaryEx,NFData) instance Show GetDirectoryA where show (GetDirectoryA xs) = unwords $ map wrapQuote xs 
--------------------------------------------------------------------- -- RULE DEFINITIONS queryRule :: (RuleResult key ~ value ,BinaryEx witness, Eq witness ,BinaryEx key, ShakeValue key ,Typeable value, NFData value, Show value, Eq value) => (value -> witness) -> (key -> IO value) -> Rules () queryRule witness query = addBuiltinRuleEx (\k old -> do new <- query k pure $ if old == new then Nothing else Just $ show new) (\_ v -> Just $ runBuilder $ putEx $ witness v) (\k old _ -> liftIO $ do new <- query k let wnew = witness new pure $ case old of Just old | wnew == getEx old -> RunResult ChangedNothing old new _ -> RunResult ChangedRecomputeDiff (runBuilder $ putEx wnew) new) defaultRuleDirectory :: Rules () defaultRuleDirectory = do -- for things we are always going to rerun, and which might take up a lot of memory to store, -- we only store their hash, so we can compute change, but not know what change happened queryRule id (\(DoesFileExistQ x) -> DoesFileExistA <$> IO.doesFileExist x) queryRule id (\(DoesDirectoryExistQ x) -> DoesDirectoryExistA <$> IO.doesDirectoryExist x) queryRule hash (\(GetEnvQ x) -> GetEnvA <$> IO.lookupEnv x) queryRule hash (\(GetDirectoryContentsQ x) -> GetDirectoryA <$> getDirectoryContentsIO x) queryRule hash (\(GetDirectoryFilesQ (a,b)) -> GetDirectoryA <$> getDirectoryFilesIO a b) queryRule hash (\(GetDirectoryDirsQ x) -> GetDirectoryA <$> getDirectoryDirsIO x) --------------------------------------------------------------------- -- RULE ENTRY POINTS -- | Returns 'True' if the file exists. The existence of the file is tracked as a -- dependency, and if the file is created or deleted the rule will rerun in subsequent builds. -- Usually used to implement include paths.
For example, given an include path of @foo@ and @bar@, -- and a file @hello.txt@, you might write: -- -- @ -- b <- 'doesFileExist' \"foo\/hello.txt\" -- let file = if b then \"foo\/hello.txt\" else \"bar\/hello.txt\" -- @ -- -- Now if the user had a file @bar\/hello.txt@, and then creates a file @foo\/hello.txt@, the -- rule would correctly rerun, as while the @hello.txt@ that was used didn't change, which -- file should be used has changed. -- -- You should not call 'doesFileExist' on files which can be created by the build system. -- The reason is that Shake operations such as this one are both cached for the duration of the build, -- and may be run preemptively during a recheck. That means you can't control the time at which -- 'doesFileExist' is called. For that to be consistent, 'doesFileExist' must return the same result at the -- start and end of the build, a property that is partially checked by the @--lint@ flag. Given a -- file created by the build system, a build from clean will return 'False' at the beginning and 'True' -- at the end, leading to a change, and thus rebuilds in subsequent runs. -- -- If you do want to know whether a file exists separate to the build system, e.g. you can perfectly -- predict the file's contents and can save some meaningful work if the file already exists, you should -- use the untracked "System.Directory" version. Such calls are not tracked by the build system, and you -- should take care that they do not lead to unpredictable results. doesFileExist :: FilePath -> Action Bool doesFileExist = fmap fromDoesFileExistA . apply1 . DoesFileExistQ . toStandard -- | Returns 'True' if the directory exists. The existence of the directory is tracked as a -- dependency, and if the directory is created or deleted the rule will rerun in subsequent builds. -- -- You should not call 'doesDirectoryExist' on directories which can be created by the build system, -- for reasons explained in 'doesFileExist'.
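The include-path pattern described for 'doesFileExist' above can be generalised to a list of directories. A minimal sketch (the directory names and the helper name @findHeader@ are assumptions, not part of this module):

```haskell
import Control.Monad (filterM)
import Development.Shake
import Development.Shake.FilePath

-- Resolve a header against a list of include directories, tracking each
-- existence test as a dependency so the rule reruns when files appear.
findHeader :: FilePath -> Action FilePath
findHeader name = do
    found <- filterM doesFileExist ["include" </> name, "vendor" </> name]
    case found of
        x:_ -> pure x
        []  -> fail $ "header not found: " ++ name
```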
doesDirectoryExist :: FilePath -> Action Bool doesDirectoryExist = fmap fromDoesDirectoryExistA . apply1 . DoesDirectoryExistQ . toStandard -- | Return 'Just' the value of the environment variable, or 'Nothing' -- if the variable is not set. The environment variable is tracked as a -- dependency, and if it changes the rule will rerun in subsequent builds. -- This function is a tracked version of 'getEnv' / 'lookupEnv' from the base library. -- -- @ -- flags <- getEnv \"CFLAGS\" -- 'cmd' \"gcc -c\" [out] (maybe [] words flags) -- @ getEnv :: String -> Action (Maybe String) getEnv = fmap fromGetEnvA . apply1 . GetEnvQ -- | @'getEnvWithDefault' def var@ returns the value of the environment variable @var@, or the -- default value @def@ if it is not set. Similar to 'getEnv'. -- -- @ -- flags <- getEnvWithDefault \"-Wall\" \"CFLAGS\" -- 'cmd' \"gcc -c\" [out] flags -- @ getEnvWithDefault :: String -> String -> Action String getEnvWithDefault def var = fromMaybe def <$> getEnv var -- | A partial variant of 'getEnv' that returns the environment variable or fails. getEnvError :: Partial => String -> Action String getEnvError name = getEnvWithDefault (error $ "getEnvError: Environment variable " ++ name ++ " is undefined") name -- | Get the contents of a directory. The result will be sorted, and will not contain -- the entries @.@ or @..@ (unlike the standard Haskell version). -- The resulting paths will be relative to the first argument. -- The result itself is tracked as a dependency, but the files in the result are not. -- If the list of files changes in subsequent builds any rule calling it will rerun. -- -- It is usually simpler to call either 'getDirectoryFiles' or 'getDirectoryDirs'. getDirectoryContents :: FilePath -> Action [FilePath] getDirectoryContents = fmap fromGetDirectoryA . apply1 . GetDirectoryContentsQ -- | Get the files anywhere under a directory that match any of a set of patterns. -- For the interpretation of the patterns see '?=='.
All results will be -- relative to the directory argument. -- The result itself is tracked as a dependency, but the files in the result are not. -- If the list of files changes in subsequent builds any rule calling it will rerun. -- Some examples: -- -- > getDirectoryFiles "Config" ["//*.xml"] -- > -- All .xml files anywhere under the Config directory -- > -- If Config/foo/bar.xml exists it will return ["foo/bar.xml"] -- > getDirectoryFiles "Modules" ["*.hs","*.lhs"] -- > -- All .hs or .lhs in the Modules directory -- > -- If Modules/foo.hs and Modules/foo.lhs exist, it will return ["foo.hs","foo.lhs"] -- -- If you require a qualified file name it is often easier to use @\"\"@ as the 'FilePath' argument, -- for example the following two expressions are equivalent: -- -- > fmap (map ("Config" </>)) (getDirectoryFiles "Config" ["//*.xml"]) -- > getDirectoryFiles "" ["Config//*.xml"] -- -- If the first argument directory does not exist it will raise an error. -- If @foo@ does not exist, then the first of these will error, but the second will not. -- -- > getDirectoryFiles "foo" ["//*"] -- error -- > getDirectoryFiles "" ["foo//*"] -- returns [] -- -- This function is tracked and serves as a dependency. If a rule calls -- @getDirectoryFiles \"\" [\"*.c\"]@ and someone adds @foo.c@ to the -- directory, that rule will rebuild. If someone changes one of the @.c@ files, -- but the /list/ of @.c@ files doesn't change, then it will not rebuild. -- As a consequence of being tracked, if the contents change during the build -- (e.g. you are generating @.c@ files in this directory) then the build will not reach -- a stable point, which is an error - detected by running with @--lint@. -- You should normally only call this function on source files. -- -- For an untracked variant see 'getDirectoryFilesIO'.
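A rule using the tracked directory listing described above might look like the following sketch (the paths, the @gcc@ command line, and the overall rule shape are illustrative assumptions):

```haskell
import Development.Shake
import Development.Shake.FilePath

-- Link every .c file found under src/ into a single program.
-- The rule reruns if the *list* of .c files changes.
rules :: Rules ()
rules =
    "_build/program" %> \out -> do
        cs <- getDirectoryFiles "src" ["//*.c"]
        let os = ["_build" </> c -<.> "o" | c <- cs]
        need os
        cmd "gcc -o" [out] os
```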
getDirectoryFiles :: FilePath -> [FilePattern] -> Action [FilePath] getDirectoryFiles dir pat = fmap fromGetDirectoryA $ apply1 $ GetDirectoryFilesQ (dir,pat) -- | Get the directories in a directory, not including @.@ or @..@. -- All directories are relative to the argument directory. -- The result itself is tracked as a dependency, but the directories in the result are not. -- If the list of directories changes in subsequent builds any rule calling it will rerun. -- The rules about creating entries described in 'getDirectoryFiles' also apply here. -- -- > getDirectoryDirs "/Users" -- > -- Return all directories in the /Users directory -- > -- e.g. ["Emily","Henry","Neil"] getDirectoryDirs :: FilePath -> Action [FilePath] getDirectoryDirs = fmap fromGetDirectoryA . apply1 . GetDirectoryDirsQ --------------------------------------------------------------------- -- IO ROUTINES getDirectoryContentsIO :: FilePath -> IO [FilePath] -- getDirectoryContents "" is equivalent to getDirectoryContents "." on Windows, -- but raises an error on Linux. We smooth out the difference. getDirectoryContentsIO dir = fmap (sort . filter (not . all (== '.'))) $ IO.getDirectoryContents $ if dir == "" then "." else dir getDirectoryDirsIO :: FilePath -> IO [FilePath] getDirectoryDirsIO dir = filterM f =<< getDirectoryContentsIO dir where f x = IO.doesDirectoryExist $ dir </> x -- | A version of 'getDirectoryFiles' that is in IO, and thus untracked. getDirectoryFilesIO :: FilePath -> [FilePattern] -> IO [FilePath] -- Known infelicity: on Windows, if you search for "foo", but have the file "FOO", -- it will match if on its own, or not if it is paired with "*", since that forces -- a full directory scan, and then it uses Haskell equality (case sensitive) getDirectoryFilesIO root pat = f "" $ snd $ walk pat where -- Even after we know they are there because we called contents, we still have to check they are directories/files -- as required f dir (Walk op) = f dir . WalkTo .
op =<< getDirectoryContentsIO (root </> dir) f dir (WalkTo (files, dirs)) = do files <- filterM (IO.doesFileExist . (root </>)) $ map (dir </>) files dirs <- concatMapM (uncurry f) =<< filterM (IO.doesDirectoryExist . (root </>) . fst) (map (first (dir </>)) dirs) pure $ files ++ dirs --------------------------------------------------------------------- -- REMOVE UTILITIES -- | Remove all files and directories that match any of the patterns within a directory. -- Some examples: -- -- @ -- 'removeFiles' \"output\" [\"\/\/*\"] -- delete everything inside \'output\' -- 'removeFiles' \"output\" [\"\/\/\"] -- delete \'output\' itself -- 'removeFiles' \".\" [\"\/\/*.hi\",\"\/\/*.o\"] -- delete all \'.hi\' and \'.o\' files -- @ -- -- If the argument directory is missing no error is raised. -- This function will follow symlinks, so should be used with care. -- -- This function is often useful when writing a @clean@ action for your build system, -- often as a 'phony' rule. removeFiles :: FilePath -> [FilePattern] -> IO () removeFiles dir pat = whenM (IO.doesDirectoryExist dir) $ do let (b,w) = walk pat if b then removeDir dir else f dir w where f dir (Walk op) = f dir . WalkTo . op =<< getDirectoryContentsIO dir f dir (WalkTo (files, dirs)) = do forM_ files $ \fil -> tryIO $ removeItem $ dir </> fil let done = Set.fromList files forM_ (filter (not . flip Set.member done . fst) dirs) $ \(d,w) -> do let dir2 = dir </> d whenM (IO.doesDirectoryExist dir2) $ f dir2 w removeItem :: FilePath -> IO () removeItem x = IO.removeFile x `catchIO` \_ -> removeDir x -- In newer GHC's removeDirectoryRecursive is probably better, but doesn't follow -- symlinks, so it's got different behaviour removeDir :: FilePath -> IO () removeDir x = do mapM_ (removeItem . (x </>)) =<< getDirectoryContentsIO x IO.removeDirectory x -- | Remove files, like 'removeFiles', but executed after the build completes successfully using 'runAfter'.
-- Useful for implementing @clean@ actions that delete files Shake may have open for building, e.g. 'shakeFiles'. -- Where possible, delete the files as a normal part of the build, e.g. using @'liftIO' $ 'removeFiles' dir pats@. removeFilesAfter :: FilePath -> [FilePattern] -> Action () removeFilesAfter a b = do putVerbose $ "Will remove " ++ unwords b ++ " from " ++ a runAfter $ removeFiles a b shake-0.19.8/src/Development/Shake/Internal/Rules/File.hs0000644000000000000000000006500107346545000021264 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables, NamedFieldPuns #-} {-# LANGUAGE ViewPatterns, RecordWildCards, FlexibleInstances, TypeFamilies, ConstraintKinds #-} module Development.Shake.Internal.Rules.File( need, needHasChanged, needBS, needed, neededBS, want, trackRead, trackWrite, trackAllow, produces, defaultRuleFile, (%>), (|%>), (?>), phony, (~>), phonys, resultHasChanged, -- * Internal only FileQ(..), FileA(..), fileStoredValue, fileEqualValue, EqualCost(..), fileForward ) where import Control.Monad.Extra import Control.Monad.IO.Class import Data.Typeable import Data.List import Data.Maybe import qualified Data.ByteString.Char8 as BS import qualified Data.HashSet as Set import Foreign.Storable import Data.Word import Data.Monoid import General.Binary import General.Extra import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Core.Build import Development.Shake.Internal.Core.Action import Development.Shake.Internal.FileName import Development.Shake.Internal.Rules.Rerun import Development.Shake.Classes import Development.Shake.FilePath(toStandard) import Development.Shake.Internal.FilePattern import Development.Shake.Internal.FileInfo import Development.Shake.Internal.Options import Development.Shake.Internal.Errors import System.FilePath(takeDirectory) -- important that this is the system local filepath, or wrong slashes go wrong import 
System.IO.Unsafe(unsafeInterleaveIO) import Prelude infix 1 %>, ?>, |%>, ~> --------------------------------------------------------------------- -- TYPES type instance RuleResult FileQ = FileR -- | The unique key we use to index File rules, to avoid name clashes. newtype FileQ = FileQ {fromFileQ :: FileName} deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) -- | Raw information about a file. data FileA = FileA {-# UNPACK #-} !ModTime {-# UNPACK #-} !FileSize FileHash deriving (Typeable) -- | Result of a File rule, may contain raw file information and whether the rule did run this build data FileR = FileR { answer :: !(Maybe FileA) -- ^ Raw information about the file built by this rule. -- Set to 'Nothing' for 'phony' files. , useLint :: !Bool -- ^ Should we lint the resulting file } deriving (Typeable) -- | The types of file rule that occur. data Mode = ModePhony (Action ()) -- ^ An action with no file value | ModeDirect (Action ()) -- ^ An action that produces this file | ModeForward (Action (Maybe FileA)) -- ^ An action that looks up a file someone else produced -- | The results of the various 'Mode' rules. data Answer = AnswerPhony | AnswerDirect Ver FileA | AnswerForward Ver FileA -- | The file rules we use, first is the name (as pretty as you can get). data FileRule = FileRule String (FilePath -> Maybe Mode) deriving Typeable --------------------------------------------------------------------- -- INSTANCES instance Show FileQ where show (FileQ x) = fileNameToString x instance BinaryEx [FileQ] where putEx = putEx . map fromFileQ getEx = map FileQ . 
getEx instance NFData FileA where rnf (FileA a b c) = rnf a `seq` rnf b `seq` rnf c instance NFData FileR where rnf (FileR a b) = rnf a `seq` rnf b instance Show FileA where show (FileA m s h) = "File {mod=" ++ show m ++ ",size=" ++ show s ++ ",digest=" ++ show h ++ "}" instance Show FileR where show FileR{..} = show answer instance Storable FileA where sizeOf _ = 4 * 3 -- 4 Word32's alignment _ = alignment (undefined :: ModTime) peekByteOff p i = FileA <$> peekByteOff p i <*> peekByteOff p (i+4) <*> peekByteOff p (i+8) pokeByteOff p i (FileA a b c) = pokeByteOff p i a >> pokeByteOff p (i+4) b >> pokeByteOff p (i+8) c instance BinaryEx FileA where putEx = putExStorable getEx = getExStorable instance BinaryEx [FileA] where putEx = putExStorableList getEx = getExStorableList fromAnswer :: Answer -> Maybe FileA fromAnswer AnswerPhony = Nothing fromAnswer (AnswerDirect _ x) = Just x fromAnswer (AnswerForward _ x) = Just x instance BinaryEx Answer where putEx AnswerPhony = mempty putEx (AnswerDirect ver x) = putExStorable ver <> putEx x putEx (AnswerForward ver x) = putEx (0 :: Word8) <> putExStorable ver <> putEx x getEx x = case BS.length x of 0 -> AnswerPhony i -> if i == sz then f AnswerDirect x else f AnswerForward $ BS.tail x where sz = sizeOf (undefined :: Ver) + sizeOf (undefined :: FileA) f ctor x = let (a,b) = binarySplit x in ctor a $ getEx b --------------------------------------------------------------------- -- FILE CHECK QUERIES -- | An equality check and a cost. data EqualCost = EqualCheap -- ^ The equality check was cheap. | EqualExpensive -- ^ The equality check was expensive, as the results are not trivially equal. | NotEqual -- ^ The values are not equal. 
deriving (Eq,Ord,Show,Read,Typeable,Enum,Bounded) fileStoredValue :: ShakeOptions -> FileQ -> IO (Maybe FileA) fileStoredValue ShakeOptions{shakeChange=c, shakeNeedDirectory=allowDir} (FileQ x) = do res <- getFileInfo allowDir x case res of Nothing -> pure Nothing Just (time,size) | c == ChangeModtime -> pure $ Just $ FileA time size noFileHash Just (time,size) -> do hash <- unsafeInterleaveIO $ getFileHash x pure $ Just $ FileA time size hash fileEqualValue :: ShakeOptions -> FileA -> FileA -> EqualCost fileEqualValue ShakeOptions{shakeChange=c} (FileA x1 x2 x3) (FileA y1 y2 y3) = case c of ChangeModtime -> bool $ x1 == y1 ChangeDigest -> bool $ x2 == y2 && x3 == y3 ChangeModtimeOrDigest -> bool $ x1 == y1 && x2 == y2 && x3 == y3 _ | x1 == y1 -> EqualCheap | x2 == y2 && x3 == y3 -> EqualExpensive | otherwise -> NotEqual where bool b = if b then EqualCheap else NotEqual -- | Arguments: options; is the file an input; a message for failure if the file does not exist; filename storedValueError :: ShakeOptions -> Bool -> String -> FileQ -> IO (Maybe FileA) {- storedValueError opts False msg x | False && not (shakeOutputCheck opts) = do when (shakeCreationCheck opts) $ do whenM (isNothing <$> (storedValue opts x :: IO (Maybe FileA))) $ error $ msg ++ "\n " ++ unpackU (fromFileQ x) pure $ FileA fileInfoEq fileInfoEq fileInfoEq -} storedValueError opts input msg x = maybe def Just <$> fileStoredValue opts2 x where def = if shakeCreationCheck opts || input then error err else Nothing err = msg ++ "\n " ++ fileNameToString (fromFileQ x) opts2 = if not input && shakeChange opts == ChangeModtimeAndDigestInput then opts{shakeChange=ChangeModtime} else opts --------------------------------------------------------------------- -- THE DEFAULT RULE defaultRuleFile :: Rules () defaultRuleFile = do opts@ShakeOptions{..} <- getShakeOptionsRules -- A rule from FileQ to (Maybe FileA). The result value is only useful for linting. 
addBuiltinRuleEx (ruleLint opts) (ruleIdentity opts) (ruleRun opts $ shakeRebuildApply opts) ruleLint :: ShakeOptions -> BuiltinLint FileQ FileR ruleLint opts k (FileR (Just v) True) = do now <- fileStoredValue opts k pure $ case now of Nothing -> Just "" Just now | fileEqualValue opts v now == EqualCheap -> Nothing | otherwise -> Just $ show now ruleLint _ _ _ = pure Nothing ruleIdentity :: ShakeOptions -> BuiltinIdentity FileQ FileR ruleIdentity opts | shakeChange opts == ChangeModtime = throwImpure errorNoHash ruleIdentity _ = \k v -> case answer v of Just (FileA _ size hash) -> Just $ runBuilder $ putExStorable size <> putExStorable hash Nothing -> Nothing ruleRun :: ShakeOptions -> (FilePath -> Rebuild) -> BuiltinRun FileQ FileR ruleRun opts@ShakeOptions{..} rebuildFlags o@(FileQ (fileNameToString -> xStr)) oldBin@(fmap getEx -> old :: Maybe Answer) mode = do -- for One, rebuild makes perfect sense -- for Forward, we expect the child will have already rebuilt - Rebuild just lets us deal with code changes -- for Phony, it doesn't make that much sense, but probably isn't harmful? 
let r = rebuildFlags xStr (ruleVer, ruleAct, ruleErr) <- getUserRuleInternal o (\(FileRule s _) -> Just s) $ \(FileRule _ f) -> f xStr let verEq v = Just v == ruleVer || case ruleAct of [] -> v == Ver 0; [(v2,_)] -> v == Ver v2; _ -> False let rebuild = do putWhen Verbose $ "# " ++ show o case ruleAct of [] -> rebuildWith Nothing [x] -> rebuildWith $ Just x _ -> throwM ruleErr case old of _ | r == RebuildNow -> rebuild _ | r == RebuildLater -> case old of Just _ -> -- ignoring the currently stored value, which may trigger lint has changed -- so disable lint on this file unLint <$> retOld ChangedNothing Nothing -> do -- i don't have a previous value, so assume this is a source node, and mark rebuild in future now <- liftIO $ fileStoredValue opts o case now of Nothing -> rebuild Just now -> do alwaysRerun; retNew ChangedStore $ AnswerDirect (Ver 0) now {- _ | r == RebuildNever -> do now <- liftIO $ fileStoredValue opts o case now of Nothing -> rebuild Just now -> do let diff | Just (AnswerDirect old) <- old, fileEqualValue opts old now /= NotEqual = ChangedRecomputeSame | otherwise = ChangedRecomputeDiff retNew diff $ AnswerDirect now -} Just (AnswerDirect ver old) | mode == RunDependenciesSame, verEq ver -> do now <- liftIO $ fileStoredValue opts o let noHash (FileA _ _ x) = isNoFileHash x case now of Nothing -> rebuild Just now -> case fileEqualValue opts old now of NotEqual -> rebuild -- if our last build used no file hashing, but this build should, then we must refresh the hash EqualCheap | if noHash old then shakeChange == ChangeModtimeAndDigestInput || noHash now else True -> retOld ChangedNothing _ -> retNew ChangedStore $ AnswerDirect ver now Just (AnswerForward ver _) | verEq ver, mode == RunDependenciesSame -> retOld ChangedNothing _ -> rebuild where -- no need to lint check forward files -- but more than that, it goes wrong if you do, see #427 fileR (AnswerDirect _ x) = FileR (Just x) True fileR (AnswerForward _ x) = FileR (Just x) False fileR AnswerPhony 
= FileR Nothing False unLint (RunResult a b c) = RunResult a b c{useLint = False} retNew :: RunChanged -> Answer -> Action (RunResult FileR) retNew c v = pure $ RunResult c (runBuilder $ putEx v) $ fileR v retOld :: RunChanged -> Action (RunResult FileR) retOld c = pure $ RunResult c (fromJust oldBin) $ fileR (fromJust old) -- actually run the rebuild rebuildWith act = do let answer ctor new = do let b = case () of _ | Just old <- old , Just old <- fromAnswer old , fileEqualValue opts old new /= NotEqual -> ChangedRecomputeSame _ -> ChangedRecomputeDiff retNew b $ ctor new case act of Nothing -> do new <- liftIO $ storedValueError opts True "Error, file does not exist and no rule available:" o answer (AnswerDirect $ Ver 0) $ fromJust new Just (ver, ModeForward act) -> do new <- act case new of Nothing -> do -- Not 100% sure how you get here, but I think it involves RebuildLater and multi-file rules historyDisable retNew ChangedRecomputeDiff AnswerPhony Just new -> answer (AnswerForward $ Ver ver) new Just (ver, ModeDirect act) -> do cache <- historyLoad ver case cache of Just encodedHash -> do Just (FileA mod size _) <- liftIO $ storedValueError opts False "Error, restored the rule but did not produce file:" o answer (AnswerDirect $ Ver ver) $ FileA mod size $ getExStorable encodedHash Nothing -> do act new <- liftIO $ storedValueError opts False "Error, rule finished running but did not produce file:" o case new of Nothing -> do -- rule ran, but didn't compute an answer, because shakeCreationCheck=False -- I think it should probably not return phony, but return a different valid-but-no-file -- but it's just too rare to bother historyDisable retNew ChangedRecomputeDiff AnswerPhony Just new@(FileA _ _ fileHash) -> do producesUnchecked [xStr] res <- answer (AnswerDirect $ Ver ver) new historySave ver $ runBuilder $ if isNoFileHash fileHash then throwImpure errorNoHash else putExStorable fileHash pure res Just (_, ModePhony act) -> do -- See #523 and #524 -- Shake 
runs the dependencies first, but stops when one has changed. -- We don't want to run the existing deps first if someone changes the build system, -- so insert a fake dependency that cuts the process dead. alwaysRerun act retNew ChangedRecomputeDiff AnswerPhony apply_ :: Partial => (a -> FileName) -> [a] -> Action [FileR] apply_ f = apply . map (FileQ . f) -- | Has a file changed. This function will only give the correct answer if called in the rule -- producing the file, /before/ the rule has modified the file in question. -- Best avoided, but sometimes necessary in conjunction with 'needHasChanged' to cause rebuilds -- to happen if the result is deleted or modified. resultHasChanged :: FilePath -> Action Bool resultHasChanged file = do let filename = FileQ $ fileNameFromString file res <- getDatabaseValue filename old<- pure $ case result <$> res of Nothing -> Nothing Just (Left bs) -> fromAnswer $ getEx bs Just (Right v) -> answer v case old of Nothing -> pure True Just old -> do opts <- getShakeOptions new <- liftIO $ fileStoredValue opts filename pure $ case new of Nothing -> True Just new -> fileEqualValue opts old new == NotEqual --------------------------------------------------------------------- -- OPTIONS ON TOP -- | Internal method for adding forwarding actions fileForward :: String -> (FilePath -> Maybe (Action (Maybe FileA))) -> Rules () fileForward help act = addUserRule $ FileRule help $ fmap ModeForward . act -- | Add a dependency on the file arguments, ensuring they are built before continuing. -- The file arguments may be built in parallel, in any order. This function is particularly -- necessary when calling 'Development.Shake.cmd' or 'Development.Shake.command'. 
As an example: -- -- @ -- \"\/\/*.rot13\" '%>' \\out -> do -- let src = 'Development.Shake.FilePath.dropExtension' out -- 'need' [src] -- 'Development.Shake.cmd' \"rot13\" [src] \"-o\" [out] -- @ -- -- Usually @need [foo,bar]@ is preferable to @need [foo] >> need [bar]@ as the former allows greater -- parallelism, while the latter requires @foo@ to finish building before starting to build @bar@. -- -- This function should not be called with wildcards (e.g. @*.txt@ - use 'getDirectoryFiles' to expand them), -- environment variables (e.g. @$HOME@ - use 'getEnv' to expand them) or directories (directories cannot be -- tracked directly - track files within the directory instead). need :: Partial => [FilePath] -> Action () need = withFrozenCallStack $ void . apply_ fileNameFromString -- | Like 'need' but returns a list of rebuilt dependencies since the calling rule last built successfully. -- -- The following example writes a list of changed dependencies to a file as its action. -- -- @ -- \"target\" '%>' \\out -> do -- let sourceList = [\"source1\", \"source2\"] -- rebuildList <- 'needHasChanged' sourceList -- 'Development.Shake.writeFileLines' out rebuildList -- @ -- -- This function can be used to alter the action depending on which dependency needed -- to be rebuilt. -- -- Note that a rule can be run even if no dependency has changed, for example -- because of 'shakeRebuild' or because the target has changed or been deleted. -- To detect the latter case you may wish to use 'resultHasChanged'.
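The combination of 'needHasChanged' and 'resultHasChanged' described above can be sketched in a small rule (an editor's illustration, not part of the library; the target and source names are hypothetical):

```haskell
import Development.Shake

-- Record which dependencies were rebuilt, and also note when the target
-- itself was deleted or edited since the last run. 'resultHasChanged' must
-- run before the rule writes the file in question.
rules :: Rules ()
rules =
    "target" %> \out -> do
        selfChanged <- resultHasChanged out
        rebuilt <- needHasChanged ["source1", "source2"]
        writeFileLines out $ ["<target itself changed>" | selfChanged] ++ rebuilt
```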
needHasChanged :: Partial => [FilePath] -> Action [FilePath] needHasChanged paths = withFrozenCallStack $ do apply_ fileNameFromString paths self <- getCurrentKey selfVal <- case self of Nothing -> pure Nothing Just self -> getDatabaseValueGeneric self case selfVal of Nothing -> pure paths -- never build before or not a key, so everything has changed Just selfVal -> flip filterM paths $ \path -> do pathVal <- getDatabaseValue (FileQ $ fileNameFromString path) pure $ case pathVal of Just pathVal | changed pathVal > built selfVal -> True _ -> False needBS :: Partial => [BS.ByteString] -> Action () needBS = withFrozenCallStack $ void . apply_ fileNameFromByteString -- | Like 'need', but if 'shakeLint' is set, check that the file does not rebuild. -- Used for adding dependencies on files that have already been used in this rule. needed :: Partial => [FilePath] -> Action () needed xs = withFrozenCallStack $ do opts <- getShakeOptions if isNothing $ shakeLint opts then need xs else neededCheck $ map fileNameFromString xs neededBS :: Partial => [BS.ByteString] -> Action () neededBS xs = withFrozenCallStack $ do opts <- getShakeOptions if isNothing $ shakeLint opts then needBS xs else neededCheck $ map fileNameFromByteString xs neededCheck :: Partial => [FileName] -> Action () neededCheck xs = withFrozenCallStack $ do opts <- getShakeOptions pre <- liftIO $ mapM (fileStoredValue opts . 
FileQ) xs post <- apply_ id xs let bad = [ (x, if isJust a then "File change" else "File created") | (x, a, FileR (Just b) _) <- zip3 xs pre post, maybe NotEqual (\a -> fileEqualValue opts a b) a == NotEqual] case bad of [] -> pure () (file,msg):_ -> throwM $ errorStructured "Lint checking error - 'needed' file required rebuilding" [("File", Just $ fileNameToString file) ,("Error",Just msg)] "" -- Either trackRead or trackWrite track :: ([FileQ] -> Action ()) -> [FilePath] -> Action () track tracker xs = do ShakeOptions{shakeLintIgnore} <- getShakeOptions let ignore = (?==*) shakeLintIgnore let ys = filter (not . ignore) xs when (ys /= []) $ tracker $ map (FileQ . fileNameFromString) ys -- | Track that a file was read by the action preceding it. If 'shakeLint' is activated -- then these files must be dependencies of this rule. Calls to 'trackRead' are -- automatically inserted in 'LintFSATrace' mode. trackRead :: [FilePath] -> Action () trackRead = track lintTrackRead -- | Track that a file was written by the action preceding it. If 'shakeLint' is activated -- then these files must either be the target of this rule, or never referred to by the build system. -- Calls to 'trackWrite' are automatically inserted in 'LintFSATrace' mode. trackWrite :: [FilePath] -> Action () trackWrite = track lintTrackWrite -- | Allow accessing a file in this rule, ignoring any subsequent 'trackRead' \/ 'trackWrite' calls matching -- the pattern. trackAllow :: [FilePattern] -> Action () trackAllow ps = do let ignore = (?==*) ps lintTrackAllow $ \(FileQ x) -> ignore $ fileNameToString x -- | This rule builds the following files, in addition to any defined by its target. -- At the end of the rule these files must have been written. -- These files must /not/ be tracked as part of the build system - two rules cannot produce -- the same file and you cannot 'need' the files it produces. 
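The contract of 'produces' described above can be illustrated with a short sketch (an editor's example; the tool name and file paths are hypothetical):

```haskell
import Development.Shake

-- A rule whose command writes a side output next to its target; 'produces'
-- declares the extra file so lint tracking and shared caches account for it.
rules :: Rules ()
rules =
    "out/result.txt" %> \out -> do
        need ["input.txt"]
        cmd_ "mytool input.txt -o" [out]  -- also writes out/result.log
        produces ["out/result.log"]
```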
produces :: [FilePath] -> Action () produces xs = do producesChecked xs trackWrite xs -- | Require that the argument files are built by the rules, used to specify the target. -- -- @ -- main = 'Development.Shake.shake' 'shakeOptions' $ do -- 'want' [\"Main.exe\"] -- ... -- @ -- -- This program will build @Main.exe@, given sufficient rules. All arguments to all 'want' calls -- may be built in parallel, in any order. -- -- This function is defined in terms of 'action' and 'need', use 'action' if you need more complex -- targets than 'want' allows. want :: Partial => [FilePath] -> Rules () want [] = pure () want xs = withFrozenCallStack $ action $ need xs root :: String -> (FilePath -> Bool) -> (FilePath -> Action ()) -> Rules () root help test act = addUserRule $ FileRule help $ \x -> if not $ test x then Nothing else Just $ ModeDirect $ do liftIO $ createDirectoryRecursive $ takeDirectory x act x -- | Declare a Make-style phony action. A phony target does not name -- a file (despite living in the same namespace as file rules); -- rather, it names some action to be executed when explicitly -- requested. You can demand 'phony' rules using 'want'. (And 'need', -- although that's not recommended.) -- -- Phony actions are intended to define recipes that can be executed -- by the user. If you 'need' a phony action in a rule then every -- execution where that rule is required will rerun both the rule and -- the phony action. However, note that phony actions are never -- executed more than once in a single build run. -- -- In make, the @.PHONY@ attribute on non-file-producing rules has a -- similar effect. However, while in make it is acceptable to omit -- the @.PHONY@ attribute as long as you don't create the file in -- question, a Shake rule which behaves this way will fail lint. -- For file-producing rules which should be -- rerun every execution of Shake, see 'Development.Shake.alwaysRerun'. 
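As a sketch of how 'want' and 'phony' fit together in a build file (an editor's example; the target and directory names are hypothetical):

```haskell
import Development.Shake

main :: IO ()
main = shakeArgs shakeOptions $ do
    want ["build"]                       -- default target
    phony "build" $ need ["_build/app"]  -- phony targets name actions, not files
    phony "clean" $ do
        putInfo "Cleaning _build"
        removeFilesAfter "_build" ["//*"]
```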
phony :: Located => String -> Action () -> Rules () phony oname@(toStandard -> name) act = do addTarget oname addPhony ("phony " ++ show oname ++ " at " ++ callStackTop) $ \s -> if s == name then Just act else Nothing -- | A predicate version of 'phony', returning 'Just' with the 'Action' for the matching rules. phonys :: Located => (String -> Maybe (Action ())) -> Rules () phonys = addPhony ("phonys at " ++ callStackTop) -- | Infix operator alias for 'phony', for the sake of consistency with normal -- rules. (~>) :: Located => String -> Action () -> Rules () (~>) oname@(toStandard -> name) act = do addTarget oname addPhony (show oname ++ " ~> at " ++ callStackTop) $ \s -> if s == name then Just act else Nothing addPhony :: String -> (String -> Maybe (Action ())) -> Rules () addPhony help act = addUserRule $ FileRule help $ fmap ModePhony . act -- | Define a rule to build files. If the first argument returns 'True' for a given file, -- the second argument will be used to build it. Usually '%>' is sufficient, but '?>' gives -- additional power. For any file used by the build system, only one rule should return 'True'. -- This function will create the directory for the result file, if necessary. -- -- @ -- (all isUpper . 'Development.Shake.FilePath.takeBaseName') '?>' \\out -> do -- let src = 'Development.Shake.FilePath.replaceBaseName' out $ map toLower $ takeBaseName out -- 'Development.Shake.writeFile'' out . map toUpper =<< 'Development.Shake.readFile'' src -- @ -- -- If the 'Action' completes successfully the file is considered up-to-date, even if the file -- has not changed. (?>) :: Located => (FilePath -> Bool) -> (FilePath -> Action ()) -> Rules () (?>) test act = priority 0.5 $ root ("?> at " ++ callStackTop) test act -- | Define a set of patterns, and if any of them match, run the associated rule. Defined in terms of '%>'. -- Think of it as the OR (@||@) equivalent of '%>'.
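The OR-style matching provided by '|%>' can be sketched as follows (an editor's example; the converter command is hypothetical):

```haskell
import Development.Shake
import Development.Shake.FilePath

-- One rule covering two extensions: foo.md.html is built from foo.md,
-- and foo.txt.html from foo.txt.
rules :: Rules ()
rules =
    ["//*.md.html", "//*.txt.html"] |%> \out -> do
        let src = dropExtension out
        need [src]
        cmd_ "render" [src] "-o" [out]
```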
(|%>) :: Located => [FilePattern] -> (FilePath -> Action ()) -> Rules () (|%>) pats act = do mapM_ addTarget pats let (simp,other) = partition simple pats case map toStandard simp of [] -> pure () [p] -> root help (\x -> toStandard x == p) act ps -> let set = Set.fromList ps in root help (flip Set.member set . toStandard) act unless (null other) $ let ps = map (?==) other in priority 0.5 $ root help (\x -> any ($ x) ps) act where help = show pats ++ " |%> at " ++ callStackTop -- | Define a rule that matches a 'FilePattern', see '?==' for the pattern rules. -- Patterns with no wildcards have higher priority than those with wildcards, and no file -- required by the system may be matched by more than one pattern at the same priority -- (see 'priority' and 'alternatives' to modify this behaviour). -- This function will create the directory for the result file, if necessary. -- -- @ -- \"*.asm.o\" '%>' \\out -> do -- let src = 'Development.Shake.FilePath.dropExtension' out -- 'need' [src] -- 'Development.Shake.cmd' \"as\" [src] \"-o\" [out] -- @ -- -- To define a build system for multiple compiled languages, we recommend using @.asm.o@, -- @.cpp.o@, @.hs.o@, to indicate which language produces an object file. -- I.e., the file @foo.cpp@ produces object file @foo.cpp.o@. -- -- Note that matching is case-sensitive, even on Windows. -- -- If the 'Action' completes successfully the file is considered up-to-date, even if the file -- has not changed. 
(%>) :: Located => FilePattern -> (FilePath -> Action ()) -> Rules () (%>) test act = withFrozenCallStack $ (if simple test then id else priority 0.5) $ do addTarget test root (show test ++ " %> at " ++ callStackTop) (test ?==) act shake-0.19.8/src/Development/Shake/Internal/Rules/Files.hs0000644000000000000000000002737107346545000021457 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} {-# LANGUAGE ViewPatterns, TypeFamilies, ConstraintKinds #-} module Development.Shake.Internal.Rules.Files( (&?>), (&%>), defaultRuleFiles ) where import Control.Monad import Control.Monad.IO.Class import Data.Maybe import Data.List.Extra import Data.Typeable import General.Binary import Development.Shake.Internal.Core.Action import Development.Shake.Internal.Core.Types hiding (Result) import Development.Shake.Internal.Core.Build import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Errors import General.Extra import Development.Shake.Internal.FileName import Development.Shake.Classes import Development.Shake.Internal.Rules.Rerun import Development.Shake.Internal.Rules.File import Development.Shake.Internal.FilePattern import Development.Shake.FilePath import Development.Shake.Internal.FileInfo import Development.Shake.Internal.Options import Data.Monoid import Prelude infix 1 &?>, &%> type instance RuleResult FilesQ = FilesA newtype FilesQ = FilesQ {fromFilesQ :: [FileQ]} deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) newtype FilesA = FilesA [FileA] deriving (Typeable,BinaryEx,NFData) instance Show FilesA where show (FilesA xs) = unwords $ "Files" : map (drop 5 . show) xs instance Show FilesQ where show (FilesQ xs) = unwords $ map (wrapQuote . 
show) xs data FilesRule = FilesRule String (FilesQ -> Maybe (Action FilesA)) deriving Typeable data Result = Result Ver FilesA instance BinaryEx Result where putEx (Result v x) = putExStorable v <> putEx x getEx s = let (a,b) = binarySplit s in Result a $ getEx b filesStoredValue :: ShakeOptions -> FilesQ -> IO (Maybe FilesA) filesStoredValue opts (FilesQ xs) = fmap FilesA . sequence <$> mapM (fileStoredValue opts) xs filesEqualValue :: ShakeOptions -> FilesA -> FilesA -> EqualCost filesEqualValue opts (FilesA xs) (FilesA ys) | length xs /= length ys = NotEqual | otherwise = foldr and_ EqualCheap $ zipWithExact (fileEqualValue opts) xs ys where and_ NotEqual _ = NotEqual and_ EqualCheap x = x and_ EqualExpensive x = if x == NotEqual then NotEqual else EqualExpensive defaultRuleFiles :: Rules () defaultRuleFiles = do opts <- getShakeOptionsRules -- A rule from FilesQ to FilesA. The result value is only useful for linting. addBuiltinRuleEx (ruleLint opts) (ruleIdentity opts) (ruleRun opts $ shakeRebuildApply opts) ruleLint :: ShakeOptions -> BuiltinLint FilesQ FilesA ruleLint _ _ (FilesA []) = pure Nothing -- in the case of disabling lint ruleLint opts k v = do now <- filesStoredValue opts k pure $ case now of Nothing -> Just "" Just now | filesEqualValue opts v now == EqualCheap -> Nothing | otherwise -> Just $ show now ruleIdentity :: ShakeOptions -> BuiltinIdentity FilesQ FilesA ruleIdentity opts | shakeChange opts == ChangeModtime = throwImpure $ errorStructured "Cannot use shakeChange=ChangeModTime with shakeShare" [] "" ruleIdentity _ = \_ (FilesA files) -> Just $ runBuilder $ putExList [putExStorable size <> putExStorable hash | FileA _ size hash <- files] ruleRun :: ShakeOptions -> (FilePath -> Rebuild) -> BuiltinRun FilesQ FilesA ruleRun opts rebuildFlags k o@(fmap getEx -> old :: Maybe Result) mode = do let r = map (rebuildFlags . fileNameToString . 
fromFileQ) $ fromFilesQ k (ruleVer, ruleAct, ruleErr) <- getUserRuleInternal k (\(FilesRule s _) -> Just s) $ \(FilesRule _ f) -> f k let verEq v = Just v == ruleVer || map (Ver . fst) ruleAct == [v] let rebuild = do putWhen Verbose $ "# " ++ show k case ruleAct of [x] -> rebuildWith x _ -> throwM ruleErr case old of _ | RebuildNow `elem` r -> rebuild _ | RebuildLater `elem` r -> case old of Just _ -> -- ignoring the currently stored value, which may trigger lint has changed -- so disable lint on this file pure $ RunResult ChangedNothing (fromJust o) $ FilesA [] Nothing -> do -- i don't have a previous value, so assume this is a source node, and mark rebuild in future now <- liftIO $ filesStoredValue opts k case now of Nothing -> rebuild Just now -> do alwaysRerun; pure $ RunResult ChangedStore (runBuilder $ putEx $ Result (Ver 0) now) now Just (Result ver old) | mode == RunDependenciesSame, verEq ver -> do v <- liftIO $ filesStoredValue opts k case v of Just v -> case filesEqualValue opts old v of NotEqual -> rebuild -- See #810, important we pass old (which can be cheaply evaluated) -- and not v, which might have some lazily-evaluated file hashes in EqualCheap -> pure $ RunResult ChangedNothing (fromJust o) old EqualExpensive -> pure $ RunResult ChangedStore (runBuilder $ putEx $ Result ver v) v Nothing -> rebuild _ -> rebuild where rebuildWith (ver, act) = do cache <- historyLoad ver v <- case cache of Just res -> fmap FilesA $ forM (zipExact (getExList res) (fromFilesQ k)) $ \(bin, file) -> do Just (FileA mod size _) <- liftIO $ fileStoredValue opts file pure $ FileA mod size $ getExStorable bin Nothing -> do FilesA v <- act producesUnchecked $ map (fileNameToString . 
fromFileQ) $ fromFilesQ k historySave ver $ runBuilder $ putExList [if isNoFileHash hash then throwImpure errorNoHash else putExStorable hash | FileA _ _ hash <- v] pure $ FilesA v let c | Just (Result _ old) <- old, filesEqualValue opts old v /= NotEqual = ChangedRecomputeSame | otherwise = ChangedRecomputeDiff pure $ RunResult c (runBuilder $ putEx $ Result (Ver ver) v) v -- | Define a rule for building multiple files at the same time. -- Think of it as the AND (@&&@) equivalent of '%>'. -- As an example, a single invocation of GHC produces both @.hi@ and @.o@ files: -- -- @ -- [\"*.o\",\"*.hi\"] '&%>' \\[o,hi] -> do -- let hs = o 'Development.Shake.FilePath.-<.>' \"hs\" -- 'Development.Shake.need' ... -- all files the .hs import -- 'Development.Shake.cmd' \"ghc -c\" [hs] -- @ -- -- However, in practice, it's usually easier to define rules with '%>' and make the @.hi@ depend -- on the @.o@. When defining rules that build multiple files, all the 'FilePattern' values must -- have the same sequence of @\/\/@ and @*@ wildcards in the same order. -- This function will create directories for the result files, if necessary. (&%>) :: Located => [FilePattern] -> ([FilePath] -> Action ()) -> Rules () [p] &%> act = withFrozenCallStack $ p %> act . pure ps &%> act | not $ compatible ps = error $ unlines $ "All patterns to &%> must have the same number and position of ** and * wildcards" : ["* " ++ p ++ (if compatible [p, headErr ps] then "" else " (incompatible)") | p <- ps] | otherwise = withFrozenCallStack $ do forM_ (zipFrom 0 ps) $ \(i,p) -> (if simple p then id else priority 0.5) $ fileForward (show ps ++ " &%> at " ++ callStackTop) $ let op = (p ?==) in \file -> if not $ op file then Nothing else Just $ do FilesA res <- apply1 $ FilesQ $ map (FileQ . fileNameFromString . substitute (extract p file)) ps pure $ if null res then Nothing else Just $ res !! 
i (if all simple ps then id else priority 0.5) $ do mapM_ addTarget ps addUserRule $ FilesRule (show ps ++ " &%> " ++ callStackTop) $ \(FilesQ xs_) -> let xs = map (fileNameToString . fromFileQ) xs_ in if not $ length xs == length ps && and (zipWithExact (?==) ps xs) then Nothing else Just $ do liftIO $ mapM_ createDirectoryRecursive $ nubOrd $ map takeDirectory xs trackAllow xs act xs getFileTimes "&%>" xs_ -- | Define a rule for building multiple files at the same time, a more powerful -- and more dangerous version of '&%>'. Think of it as the AND (@&&@) equivalent of '?>'. -- -- Given an application @test &?> ...@, @test@ should return @Just@ if the rule applies, and should -- return the list of files that will be produced. This list /must/ include the file passed as an argument and should -- obey the invariant: -- -- > forAll $ \x ys -> test x == Just ys ==> x `elem` ys && all ((== Just ys) . test) ys -- -- Intuitively, the function defines a set partitioning, mapping each element to the partition that contains it. -- As an example of a function satisfying the invariant: -- -- @ -- test x | 'Development.Shake.FilePath.takeExtension' x \`elem\` [\".hi\",\".o\"] -- = Just ['Development.Shake.FilePath.dropExtension' x 'Development.Shake.FilePath.<.>' \"hi\", 'Development.Shake.FilePath.dropExtension' x 'Development.Shake.FilePath.<.>' \"o\"] -- test _ = Nothing -- @ -- -- Regardless of whether @Foo.hi@ or @Foo.o@ is passed, the function always returns @[Foo.hi, Foo.o]@. (&?>) :: Located => (FilePath -> Maybe [FilePath]) -> ([FilePath] -> Action ()) -> Rules () (&?>) test act = priority 0.5 $ do let inputOutput suf inp out = ["Input" ++ suf ++ ":", " " ++ inp] ++ ["Output" ++ suf ++ ":"] ++ map (" "++) out let normTest = fmap (map $ toStandard . normaliseEx) . test let checkedTest x = case normTest x of Nothing -> Nothing Just ys | x `notElem` ys -> error $ unlines $ "Invariant broken in &?>, did not return the input (after normalisation)."
: inputOutput "" x ys Just ys | bad:_ <- filter ((/= Just ys) . normTest) ys -> error $ unlines $ ["Invariant broken in &?>, not equalValue for all arguments (after normalisation)."] ++ inputOutput "1" x ys ++ inputOutput "2" bad (fromMaybe ["Nothing"] $ normTest bad) Just ys -> Just ys fileForward ("&?> at " ++ callStackTop) $ \x -> case checkedTest x of Nothing -> Nothing Just ys -> Just $ do FilesA res <- apply1 $ FilesQ $ map (FileQ . fileNameFromString) ys pure $ if null res then Nothing else Just $ res !! fromJust (elemIndex x ys) addUserRule $ FilesRule ("&?> " ++ callStackTop) $ \(FilesQ xs_) -> let xs@(x:_) = map (fileNameToString . fromFileQ) xs_ in case checkedTest x of Just ys | ys == xs -> Just $ do liftIO $ mapM_ createDirectoryRecursive $ nubOrd $ map takeDirectory xs act xs getFileTimes "&?>" xs_ Just ys -> error $ "Error, &?> is incompatible with " ++ show xs ++ " vs " ++ show ys Nothing -> Nothing getFileTimes :: String -> [FileQ] -> Action FilesA getFileTimes name xs = do opts <- getShakeOptions let opts2 = if shakeChange opts == ChangeModtimeAndDigestInput then opts{shakeChange=ChangeModtime} else opts ys <- liftIO $ mapM (fileStoredValue opts2) xs case sequence ys of Just ys -> pure $ FilesA ys Nothing | not $ shakeCreationCheck opts -> pure $ FilesA [] Nothing -> do let missing = length $ filter isNothing ys error $ "Error, " ++ name ++ " rule failed to produce " ++ show missing ++ " file" ++ (if missing == 1 then "" else "s") ++ " (out of " ++ show (length xs) ++ ")" ++ concat ["\n " ++ fileNameToString x ++ if isNothing y then " - MISSING" else "" | (FileQ x,y) <- zipExact xs ys] shake-0.19.8/src/Development/Shake/Internal/Rules/Oracle.hs0000644000000000000000000001672607346545000021624 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} {-# LANGUAGE TypeFamilies, TypeOperators, ConstraintKinds #-} module Development.Shake.Internal.Rules.Oracle( addOracle, addOracleCache, addOracleHash, 
askOracle, askOracles ) where import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Options import Development.Shake.Internal.Core.Build import Development.Shake.Internal.Value import Development.Shake.Classes import qualified Data.ByteString as BS import qualified Data.ByteString.Lazy as LBS import Control.Monad import Data.Binary import General.Binary import General.Extra -- Use short type names, since the names appear in the Haddock, and are too long if they are in full newtype OracleQ question = OracleQ question deriving (Show,Typeable,Eq,Hashable,Binary,NFData) newtype OracleA answer = OracleA answer deriving (Show,Typeable,Eq,Hashable,Binary,NFData) fromOracleA :: OracleA a -> a fromOracleA (OracleA x) = x type instance RuleResult (OracleQ a) = OracleA (RuleResult a) data Flavor = Norm | Cache | Hash deriving Eq addOracleFlavor :: (Located, RuleResult q ~ a, ShakeValue q, ShakeValue a) => Flavor -> (q -> Action a) -> Rules (q -> Action a) addOracleFlavor flavor act = do -- rebuild is automatic for oracles, skip just means we don't rebuild opts <- getShakeOptionsRules let skip = shakeRebuildApply opts "" == RebuildLater addBuiltinRule noLint (\_ v -> Just $ runBuilder $ putEx $ hash v) $ \(OracleQ q) old mode -> case old of Just old | (flavor /= Hash && skip) || (flavor == Cache && mode == RunDependenciesSame) -> pure $ RunResult ChangedNothing old $ decode' old _ -> do -- can only use cmpHash if flavor == Hash let cmpValue new = if fmap decode' old == Just new then ChangedRecomputeSame else ChangedRecomputeDiff let cmpHash newHash = if old == Just newHash then ChangedRecomputeSame else ChangedRecomputeDiff cache <- if flavor == Cache then historyLoad 0 else pure Nothing case cache of Just newEncode -> do let new = decode' newEncode pure $ RunResult (cmpValue new) newEncode new Nothing -> do new <- OracleA <$> act q let newHash = encodeHash new let newEncode = encode' new when (flavor == 
Cache) $ historySave 0 newEncode pure $ if flavor == Hash then RunResult (cmpHash newHash) newHash new else RunResult (cmpValue new) newEncode new pure askOracle where encodeHash :: Hashable a => a -> BS.ByteString encodeHash = runBuilder . putEx . hash encode' :: Binary a => a -> BS.ByteString encode' = BS.concat . LBS.toChunks . encode decode' :: Binary a => BS.ByteString -> a decode' = decode . LBS.fromChunks . pure -- | Add extra information which rules can depend on. -- An oracle is a function from a question type @q@, to an answer type @a@. -- As an example, we can define an oracle allowing you to depend on the current version of GHC: -- -- @ -- newtype GhcVersion = GhcVersion () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) -- type instance RuleResult GhcVersion = String -- rules = do -- 'addOracle' $ \\(GhcVersion _) -> 'Development.Shake.fromStdout' \<$\> 'Development.Shake.cmd' \"ghc --numeric-version\" :: Action String -- ... rules ... -- @ -- -- If a rule calls @'askOracle' (GhcVersion ())@, that rule will be rerun whenever the GHC version changes. -- Some notes: -- -- * We define @GhcVersion@ with a @newtype@ around @()@, allowing the use of @GeneralizedNewtypeDeriving@. -- All the necessary type classes are exported from "Development.Shake.Classes". -- -- * The @type instance@ requires the extension @TypeFamilies@. -- -- * Each call to 'addOracle' must use a different type of question. -- -- * Actions passed to 'addOracle' will be run in every build they are required, even if nothing else changes, -- so be careful of slow actions. -- If the result of an oracle does not change it will not invalidate any rules depending on it. -- To always rerun files rules see 'Development.Shake.alwaysRerun'. 
-- -- As a more complex example, consider tracking Haskell package versions: -- -- @ -- newtype GhcPkgList = GhcPkgList () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) -- type instance RuleResult GhcPkgList = [(String, String)] -- newtype GhcPkgVersion = GhcPkgVersion String deriving (Show,Typeable,Eq,Hashable,Binary,NFData) -- type instance RuleResult GhcPkgVersion = Maybe String -- -- rules = do -- getPkgList \<- 'addOracle' $ \\GhcPkgList{} -> do -- Stdout out <- 'Development.Shake.cmd' \"ghc-pkg list --simple-output\" -- pure [(reverse b, reverse a) | x <- words out, let (a,_:b) = break (== \'-\') $ reverse x] -- -- getPkgVersion \<- 'addOracle' $ \\(GhcPkgVersion pkg) -> do -- pkgs <- getPkgList $ GhcPkgList () -- pure $ lookup pkg pkgs -- -- \"myrule\" %> \\_ -> do -- getPkgVersion $ GhcPkgVersion \"shake\" -- ... rule using the shake version ... -- @ -- -- Using these definitions, any rule depending on the version of @shake@ -- should call @getPkgVersion $ GhcPkgVersion \"shake\"@ to rebuild when @shake@ is upgraded. -- -- If you apply 'versioned' to an oracle it will cause that oracle result to be discarded, and not do early-termination. addOracle :: (RuleResult q ~ a, ShakeValue q, ShakeValue a, Partial) => (q -> Action a) -> Rules (q -> Action a) addOracle = withFrozenCallStack $ addOracleFlavor Norm -- | An alternative to 'addOracle' that relies on the 'hash' function providing a perfect equality, -- doesn't support @--skip@, but requires less storage. addOracleHash :: (RuleResult q ~ a, ShakeValue q, ShakeValue a, Partial) => (q -> Action a) -> Rules (q -> Action a) addOracleHash = withFrozenCallStack $ addOracleFlavor Hash -- | A combination of 'addOracle' and 'newCache' - an action that only runs when its dependencies change, -- whose result is stored in the database. -- -- * Does the information need recomputing every time? e.g. looking up stuff in the environment? -- If so, use 'addOracle' instead.
-- -- * Is the action mostly deserialising some file? If so, use 'newCache'. -- -- * Is the operation an expensive computation from other results? If so, use 'addOracleCache'. -- -- An alternative to using 'addOracleCache' is introducing an intermediate file containing the result, -- which requires less storage in the Shake database and can be inspected by existing file-system viewing -- tools. addOracleCache :: (RuleResult q ~ a, ShakeValue q, ShakeValue a, Partial) => (q -> Action a) -> Rules (q -> Action a) addOracleCache = withFrozenCallStack $ addOracleFlavor Cache -- | Get information previously added with 'addOracle' or 'addOracleCache'. -- The question/answer types must match those provided previously. askOracle :: (RuleResult q ~ a, ShakeValue q, ShakeValue a) => q -> Action a askOracle = fmap fromOracleA . apply1 . OracleQ -- | A parallel version of 'askOracle'. askOracles :: (RuleResult q ~ a, ShakeValue q, ShakeValue a) => [q] -> Action [a] askOracles = fmap (map fromOracleA) . apply . map OracleQ shake-0.19.8/src/Development/Shake/Internal/Rules/OrderOnly.hs0000644000000000000000000000222307346545000022319 0ustar0000000000000000 module Development.Shake.Internal.Rules.OrderOnly( orderOnly, orderOnlyBS ) where import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Action import Development.Shake.Internal.Rules.File import qualified Data.ByteString.Char8 as BS -- | Define order-only dependencies, these are dependencies that will always -- be built before continuing, but which aren't dependencies of this action. -- Mostly useful for defining generated dependencies you think might be real dependencies. -- If they turn out to be real dependencies, you should add an explicit dependency afterwards.
-- -- @ -- \"source.o\" %> \\out -> do -- 'orderOnly' [\"header.h\"] -- 'cmd_' \"gcc -c source.c -o source.o -MMD -MF source.m\" -- 'neededMakefileDependencies' \"source.m\" -- @ -- -- If @header.h@ is included by @source.c@ then the call to 'neededMakefileDependencies' will cause -- it to be added as a real dependency. If it isn't, then the rule won't rebuild if it changes. orderOnly :: [FilePath] -> Action () orderOnly = orderOnlyAction . need orderOnlyBS :: [BS.ByteString] -> Action () orderOnlyBS = orderOnlyAction . needBS shake-0.19.8/src/Development/Shake/Internal/Rules/Rerun.hs0000644000000000000000000000303007346545000021472 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} {-# LANGUAGE TypeFamilies #-} module Development.Shake.Internal.Rules.Rerun( defaultRuleRerun, alwaysRerun ) where import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Build import Development.Shake.Internal.Core.Action import Development.Shake.Classes import qualified Data.ByteString as BS import General.Binary newtype AlwaysRerunQ = AlwaysRerunQ () deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show AlwaysRerunQ where show _ = "alwaysRerun" type instance RuleResult AlwaysRerunQ = () -- | Always rerun the associated action. Useful for defining rules that query -- the environment. For example: -- -- @ -- \"ghcVersion.txt\" 'Development.Shake.%>' \\out -> do -- 'alwaysRerun' -- 'Development.Shake.Stdout' stdout <- 'Development.Shake.cmd' \"ghc --numeric-version\" -- 'Development.Shake.writeFileChanged' out stdout -- @ -- -- In @make@, the @.PHONY@ attribute on file-producing rules has a similar effect. -- -- Note that 'alwaysRerun' is applied when a rule is executed. Modifying an existing rule -- to insert 'alwaysRerun' will /not/ cause that rule to rerun next time. 
alwaysRerun :: Action () alwaysRerun = do historyDisable apply1 $ AlwaysRerunQ () defaultRuleRerun :: Rules () defaultRuleRerun = addBuiltinRuleEx noLint noIdentity $ \AlwaysRerunQ{} _ _ -> pure $ RunResult ChangedRecomputeDiff BS.empty () shake-0.19.8/src/Development/Shake/Internal/Value.hs0000644000000000000000000000773707346545000020373 0ustar0000000000000000{-# LANGUAGE ExistentialQuantification, RecordWildCards, ScopedTypeVariables #-} {-# LANGUAGE ConstraintKinds #-} -- | This module implements the Key/Value types, to abstract over heterogeneous data types. module Development.Shake.Internal.Value( Value, newValue, fromValue, Key, newKey, fromKey, typeKey, ShakeValue ) where import Development.Shake.Classes import Development.Shake.Internal.Errors import Data.Typeable import Unsafe.Coerce -- | Define an alias for the six type classes required for things involved in Shake rules. -- Using this alias requires the @ConstraintKinds@ extension. -- -- To define your own values meeting the necessary constraints it is convenient to use the extensions -- @GeneralizedNewtypeDeriving@ and @DeriveDataTypeable@ to write: -- -- > newtype MyType = MyType (String, Bool) deriving (Show, Typeable, Eq, Hashable, Binary, NFData) -- -- Shake needs these instances on keys and values. They are used for: -- -- * 'Show' is used to print out keys in errors, profiling, progress messages -- and diagnostics. -- -- * 'Typeable' is used because Shake indexes its database by the -- type of the key and value involved in the rule (overlap is not -- allowed for type classes and not allowed in Shake either). -- -- * 'Eq' and 'Hashable' are used on keys in order to build hash maps -- from keys to values. 'Eq' is used on values to test if the value -- has changed or not (this is used to support unchanging rebuilds, -- where Shake can avoid rerunning rules if it runs a dependency, -- but it turns out that no changes occurred.) 
The 'Hashable' -- instances are only used at runtime (never serialised to disk), -- so they do not have to be stable across runs. -- Hashable on values is not used, and only required for a consistent interface. -- -- * 'Binary' is used to serialize keys and values into Shake's -- build database; this lets Shake cache values across runs and -- implement unchanging rebuilds. -- -- * 'NFData' is used to avoid space and thunk leaks, especially -- when Shake is parallelized. type ShakeValue a = (Show a, Typeable a, Eq a, Hashable a, Binary a, NFData a) -- We deliberately avoid Typeable instances on Key/Value to stop them accidentally -- being used inside themselves data Key = forall a . Key {keyType :: TypeRep ,keyShow :: a -> String ,keyRnf :: a -> () ,keyEq :: a -> a -> Bool ,keyHash :: Int -> a -> Int ,keyValue :: a } data Value = forall a . Value {valueType :: TypeRep ,valueShow :: a -> String ,valueRnf :: a -> () ,valueValue :: a } newKey :: forall a . ShakeValue a => a -> Key newKey = Key (typeRep (Proxy :: Proxy a)) show rnf (==) hashWithSalt newValue :: forall a . (Typeable a, Show a, NFData a) => a -> Value newValue = Value (typeRep (Proxy :: Proxy a)) show rnf typeKey :: Key -> TypeRep typeKey Key{..} = keyType fromKey :: forall a . Typeable a => Key -> a fromKey Key{..} | keyType == resType = unsafeCoerce keyValue | otherwise = throwImpure $ errorInternal $ "fromKey, bad cast, have " ++ show keyType ++ ", wanted " ++ show resType where resType = typeRep (Proxy :: Proxy a) fromValue :: forall a . 
Typeable a => Value -> a fromValue Value{..} | valueType == resType = unsafeCoerce valueValue | otherwise = throwImpure $ errorInternal $ "fromValue, bad cast, have " ++ show valueType ++ ", wanted " ++ show resType where resType = typeRep (Proxy :: Proxy a) instance Show Key where show Key{..} = keyShow keyValue instance Show Value where show Value{..} = valueShow valueValue instance NFData Key where rnf Key{..} = keyRnf keyValue instance NFData Value where rnf Value{..} = valueRnf valueValue instance Hashable Key where hash Key{..} = keyHash (hash keyType) keyValue hashWithSalt salt Key{..} = keyHash (hashWithSalt salt keyType) keyValue instance Eq Key where Key{keyType=at,keyValue=a,keyEq=eq} == Key{keyType=bt,keyValue=b} | at /= bt = False | otherwise = eq a (unsafeCoerce b) shake-0.19.8/src/Development/Shake/Rule.hs0000644000000000000000000001676107346545000016457 0ustar0000000000000000 -- | This module is used for defining new types of rules for Shake build systems, e.g. to support values stored in a database. -- Most users will find the built-in set of rules sufficient. The functions in this module are designed for high-performance, -- not ease of use or abstraction. As a result, they are difficult to work with and change more often than the other parts of Shake. -- Before writing a builtin rule you are encouraged to use 'Development.Shake.addOracle' or 'Development.Shake.addOracleCache' if possible. -- With all those warnings out the way, read on for the grungy details. module Development.Shake.Rule( -- * Builtin rules -- $builtin_rules -- ** Extensions -- $extensions -- ** Worked example -- $example -- * Defining builtin rules -- | Functions and types for defining new types of Shake rules. addBuiltinRule, BuiltinLint, noLint, BuiltinIdentity, noIdentity, BuiltinRun, RunMode(..), RunChanged(..), RunResult(..), -- * Calling builtin rules -- | Wrappers around calling Shake rules. In general these should be specialised to a builtin rule. 
apply, apply1, -- * User rules -- | Define user rules that can be used by builtin rules. -- Absent any builtin rule making use of a user rule at a given type, a user rule will have no effect - -- they have no inherent effect or interpretation on their own. addUserRule, getUserRuleList, getUserRuleMaybe, getUserRuleOne, -- * Lint integration -- | Provide lint warnings when running code. lintTrackRead, lintTrackWrite, lintTrackAllow, -- * History caching -- | Interact with the non-local cache. When using the cache it is important that all -- rules have accurate 'BuiltinIdentity' functions. historyIsEnabled, historySave, historyLoad ) where import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Action import Development.Shake.Internal.Core.Build import Development.Shake.Internal.Core.Rules -- $builtin_rules -- -- Shake \"Builtin\" rules are ones that map keys to values - e.g. files to file contents. For each builtin rule you need to think: -- -- * What is the @key@ type, which uniquely identifies each location, e.g. a filename. -- -- * What is the @value@ type. The @value@ is not necessarily the full value, but is the result people can get if they ask -- for the value associated with the @key@. As an example, for files when you 'need' a file you don't get any value back from -- the file, so a simple file rule could have @()@ as its value. -- -- * What information is stored between runs. This information should be sufficient to check if the value has changed since last time, -- e.g. the modification time for files. -- -- Typically a custom rule will define a wrapper of type 'Rules' that calls 'addBuiltinRule', along with a type-safe wrapper over -- 'apply' so users can introduce dependencies. 
-- $extensions -- -- Once you have implemented the basic functionality there is more scope for embracing additional features of Shake, e.g.: -- -- * You can integrate with cached history by providing a working 'BuiltinIdentity' and using 'historySave' and 'historyLoad'. -- -- * You can let users provide their own rules which you interpret with 'addUserRule'. -- -- * You can integrate with linting by specifying a richer 'BuiltinLint' and options like 'lintTrackRead'. -- -- There are lots of rules defined in the Shake repo at . -- You are encouraged to read those for inspiration. -- $example -- -- Shake provides a very comprehensive file rule which currently runs to over 500 lines of code, and supports lots of features -- and optimisations. However, let's imagine we want to define a simpler rule type for files. As mentioned earlier, we have to make some decisions. -- -- * A @key@ will just be the file name. -- -- * A @value@ will be @()@ - when the user depends on a file they don't expect any information in return. -- -- * The stored information will be the contents of the file, in its entirety. Alternative choices would be the modtime or a hash of the contents, -- but Shake doesn't require that. The stored information in Shake must be stored in a 'ByteString', so we use 'Data.ByteString.pack' and -- 'Data.ByteString.unpack' to convert. -- -- * We will allow user rules to be defined saying how to build any individual file. -- -- First we define the type of key and value, deriving all the necessary type classes. We define a @newtype@ over 'FilePath' so we can -- guarantee not to conflict with anyone else. Typically you wouldn't export the @File@ type, providing only sugar functions over it. -- -- > newtype File = File FilePath -- > deriving (Show,Eq,Hashable,Binary,NFData) -- > type instance RuleResult File = () -- -- Since we have decided we are also going to have user rules, we need to define a new type to capture the information stored by the rules. 
-- We need to store at least the file it is producing and the action, which we do with: -- -- > data FileRule = FileRule File (Action ()) -- -- With the definitions above users could call 'apply' and 'addUserRule' directly, but that's tedious and not very type safe. To make it easier -- we introduce some helpers: -- -- > fileRule :: FilePath -> Action () -> Rules () -- > fileRule file act = addUserRule $ FileRule (File file) act -- > -- > fileNeed :: FilePath -> Action () -- > fileNeed = apply1 . File -- -- These helpers just add our type names, providing a more pleasant interface for the user. Using these functions we can -- exercise our build system with: -- -- > example = do -- > fileRule "a.txt" $ pure () -- > fileRule "b.txt" $ do -- > fileNeed "a.txt" -- > liftIO $ writeFile "b.txt" . reverse =<< readFile "a.txt" -- > -- > action $ fileNeed "b.txt" -- -- This example defines rules for @a.txt@ (a source file) and @b.txt@ (the 'reverse' of @a.txt@). At runtime this example will -- complain about not having a builtin rule for @File@, so the only thing left is to provide one. -- -- > addBuiltinFileRule :: Rules () -- > addBuiltinFileRule = addBuiltinRule noLint noIdentity run -- > where -- > fileContents (File x) = do b <- IO.doesFileExist x; if b then IO.readFile' x else pure "" -- > -- > run :: BuiltinRun File () -- > run key old mode = do -- > now <- liftIO $ fileContents key -- > if mode == RunDependenciesSame && fmap BS.unpack old == Just now then -- > pure $ RunResult ChangedNothing (BS.pack now) () -- > else do -- > (_, act) <- getUserRuleOne key (const Nothing) $ \(FileRule k act) -> if k == key then Just act else Nothing -- > act -- > now <- liftIO $ fileContents key -- > pure $ RunResult ChangedRecomputeDiff (BS.pack now) () -- -- We define a wrapper @addBuiltinFileRule@ that calls @addBuiltinRule@, opting out of linting and cached storage. 
-- The only thing we provide is a 'BuiltinRun' function which gets the previous state, and whether any dependency has changed, -- and decides whether to rebuild. If something has changed we call 'getUserRuleOne' to find the user's rule and rerun it. -- The 'RunResult' says what changed (either 'ChangedNothing' or 'ChangedRecomputeDiff' in our case), gives us a new stored value -- (just packing the contents) and the @value@ which is @()@. -- -- To execute our example we need to also call @addBuiltinFileRule@, and now everything works. shake-0.19.8/src/Development/Shake/Util.hs0000644000000000000000000001017307346545000016454 0ustar0000000000000000 -- | A module for useful utility functions for Shake build systems. module Development.Shake.Util( parseMakefile, needMakefileDependencies, neededMakefileDependencies, shakeArgsAccumulate, shakeArgsPrune, shakeArgsPruneWith, ) where import Development.Shake import Development.Shake.Internal.Rules.File import qualified Data.ByteString.Char8 as BS import qualified General.Makefile as BS import Data.Tuple.Extra import Data.List import General.GetOpt import Data.IORef import Data.Maybe import Control.Monad.Extra import System.IO.Extra as IO -- | Given the text of a Makefile, extract the list of targets and dependencies. Assumes a -- small subset of Makefile syntax, mostly that generated by @gcc -MM@. -- -- > parseMakefile "a: b c\nd : e" == [("a",["b","c"]),("d",["e"])] parseMakefile :: String -> [(FilePath, [FilePath])] parseMakefile = map (BS.unpack *** map BS.unpack) . BS.parseMakefile . BS.pack -- | Depend on the dependencies listed in a Makefile. Does not depend on the Makefile itself. -- -- > needMakefileDependencies file = need . concatMap snd . parseMakefile =<< liftIO (readFile file) needMakefileDependencies :: FilePath -> Action () needMakefileDependencies file = needBS . concatMap snd . BS.parseMakefile =<< liftIO (BS.readFile file) -- | Depend on the dependencies listed in a Makefile. 
Does not depend on the Makefile itself. -- Use this function to indicate that you have /already/ used the files in question. -- -- > neededMakefileDependencies file = needed . concatMap snd . parseMakefile =<< liftIO (readFile file) neededMakefileDependencies :: FilePath -> Action () neededMakefileDependencies file = neededBS . concatMap snd . BS.parseMakefile =<< liftIO (BS.readFile file) -- | Like `shakeArgsWith`, but instead of accumulating a list of flags, apply functions to a default value. -- Usually used to populate a record structure. As an example of a build system that can use either @gcc@ or @distcc@ for compiling: -- -- @ -- import System.Console.GetOpt -- -- data Flags = Flags {distCC :: Bool} deriving Eq -- flags = [Option \"\" [\"distcc\"] (NoArg $ Right $ \\x -> x{distCC=True}) \"Run distributed.\"] -- -- main = 'shakeArgsAccumulate' 'shakeOptions' flags (Flags False) $ \\flags targets -> pure $ Just $ do -- if null targets then 'want' [\"result.exe\"] else 'want' targets -- let compiler = if distCC flags then \"distcc\" else \"gcc\" -- \"*.o\" '%>' \\out -> do -- 'need' ... -- 'cmd' compiler ... -- ... -- @ -- -- Now you can pass @--distcc@ to use the @distcc@ compiler. shakeArgsAccumulate :: ShakeOptions -> [OptDescr (Either String (a -> a))] -> a -> (a -> [String] -> IO (Maybe (Rules ()))) -> IO () shakeArgsAccumulate opts flags def f = shakeArgsWith opts flags $ \flags targets -> f (foldl' (flip ($)) def flags) targets -- | Like 'shakeArgs' but also takes a pruning function. If @--prune@ is passed, then after the build has completed, -- the second argument is called with a list of the files that the build checked were up-to-date. 
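The prune callback is just an IO action over the list of checked files. A minimal sketch of wiring it up might look like the following; the @_build@ directory, the rule, and the verbatim path comparison are placeholder assumptions for illustration (a real build may need to normalise paths before comparing):

```haskell
import Development.Shake
import System.Directory (removeFile)
import System.Directory.Extra (listFilesRecursive)
import Control.Monad (forM_, unless)
import qualified Data.HashSet as Set

-- Sketch only: delete every file under a hypothetical _build directory
-- that the build did not check as up-to-date.
main :: IO ()
main = shakeArgsPrune shakeOptions prune rules
  where
    prune live = do
        let keep = Set.fromList live          -- files the build checked
        files <- listFilesRecursive "_build"  -- everything on disk
        forM_ files $ \f ->
            unless (f `Set.member` keep) $ removeFile f

    rules = do
        want ["_build/out.txt"]
        "_build/out.txt" %> \out -> writeFile' out "output"
```

Running the build with @--prune@ then removes any leftover outputs from deleted rules.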
shakeArgsPrune :: ShakeOptions -> ([FilePath] -> IO ()) -> Rules () -> IO () shakeArgsPrune opts prune rules = shakeArgsPruneWith opts prune [] f where f _ files = pure $ Just $ if null files then rules else want files >> withoutActions rules -- | A version of 'shakeArgsPrune' that also takes a list of extra options to use. shakeArgsPruneWith :: ShakeOptions -> ([FilePath] -> IO ()) -> [OptDescr (Either String a)] -> ([a] -> [String] -> IO (Maybe (Rules ()))) -> IO () shakeArgsPruneWith opts prune flags act = do let flags2 = Option "P" ["prune"] (NoArg $ Right Nothing) "Remove stale files" : map (fmapFmapOptDescr Just) flags pruning <- newIORef False shakeArgsWith opts flags2 $ \opts args -> case sequence opts of Nothing -> do writeIORef pruning True pure Nothing Just opts -> act opts args whenM (readIORef pruning) $ IO.withTempFile $ \file -> do shakeArgsWith opts{shakeLiveFiles=file : shakeLiveFiles opts} flags2 $ \opts args -> act (catMaybes opts) args src <- lines <$> IO.readFile' file prune src shake-0.19.8/src/General/0000755000000000000000000000000007346545000013241 5ustar0000000000000000shake-0.19.8/src/General/Bilist.hs0000644000000000000000000000166407346545000015032 0ustar0000000000000000 -- | List type that supports O(1) amortized 'cons', 'snoc', 'uncons' and 'isEmpty'. 
module General.Bilist( Bilist, cons, snoc, uncons, toList, isEmpty ) where import Data.Semigroup import Prelude data Bilist a = Bilist [a] [a] toList :: Bilist a -> [a] toList (Bilist as bs) = as ++ reverse bs isEmpty :: Bilist a -> Bool isEmpty (Bilist as bs) = null as && null bs instance Eq a => Eq (Bilist a) where a == b = toList a == toList b instance Semigroup (Bilist a) where a <> b = Bilist (toList a ++ toList b) [] instance Monoid (Bilist a) where mempty = Bilist [] [] mappend = (<>) cons :: a -> Bilist a -> Bilist a cons x (Bilist as bs) = Bilist (x:as) bs snoc :: Bilist a -> a -> Bilist a snoc (Bilist as bs) x = Bilist as (x:bs) uncons :: Bilist a -> Maybe (a, Bilist a) uncons (Bilist [] []) = Nothing uncons (Bilist (a:as) bs) = Just (a, Bilist as bs) uncons (Bilist [] bs) = uncons $ Bilist (reverse bs) [] shake-0.19.8/src/General/Binary.hs0000644000000000000000000002027407346545000015026 0ustar0000000000000000{-# LANGUAGE FlexibleInstances, ScopedTypeVariables, Rank2Types #-} module General.Binary( BinaryOp(..), binaryOpMap, binarySplit, binarySplit2, binarySplit3, unsafeBinarySplit, Builder(..), runBuilder, sizeBuilder, BinaryEx(..), Storable, putExStorable, getExStorable, putExStorableList, getExStorableList, putExList, getExList, putExN, getExN ) where import Development.Shake.Classes import Control.Monad import Data.Binary import Data.List.Extra import Data.Tuple.Extra import Foreign.Marshal.Utils import Foreign.Storable import Foreign.Ptr import System.IO.Unsafe as U import qualified Data.ByteString as BS import qualified Data.ByteString.Internal as BS import qualified Data.ByteString.Unsafe as BS import qualified Data.ByteString.Lazy as LBS import qualified Data.ByteString.UTF8 as UTF8 import Data.Semigroup import Prelude --------------------------------------------------------------------- -- STORE TYPE -- | An explicit and more efficient version of Binary data BinaryOp v = BinaryOp {putOp :: v -> Builder ,getOp :: BS.ByteString -> v } binaryOpMap 
:: BinaryEx a => (a -> BinaryOp b) -> BinaryOp (a, b) binaryOpMap mp = BinaryOp {putOp = \(a, b) -> putExN (putEx a) <> putOp (mp a) b ,getOp = \bs -> let (bs1,bs2) = getExN bs; a = getEx bs1 in (a, getOp (mp a) bs2) } binarySplit :: forall a . Storable a => BS.ByteString -> (a, BS.ByteString) binarySplit bs | BS.length bs < sizeOf (undefined :: a) = error "Reading from ByteString, insufficient left" | otherwise = unsafeBinarySplit bs binarySplit2 :: forall a b . (Storable a, Storable b) => BS.ByteString -> (a, b, BS.ByteString) binarySplit2 bs | BS.length bs < sizeOf (undefined :: a) + sizeOf (undefined :: b) = error "Reading from ByteString, insufficient left" | (a,bs) <- unsafeBinarySplit bs, (b,bs) <- unsafeBinarySplit bs = (a,b,bs) binarySplit3 :: forall a b c . (Storable a, Storable b, Storable c) => BS.ByteString -> (a, b, c, BS.ByteString) binarySplit3 bs | BS.length bs < sizeOf (undefined :: a) + sizeOf (undefined :: b) + sizeOf (undefined :: c) = error "Reading from ByteString, insufficient left" | (a,bs) <- unsafeBinarySplit bs, (b,bs) <- unsafeBinarySplit bs, (c,bs) <- unsafeBinarySplit bs = (a,b,c,bs) unsafeBinarySplit :: Storable a => BS.ByteString -> (a, BS.ByteString) unsafeBinarySplit bs = (v, BS.unsafeDrop (sizeOf v) bs) where v = unsafePerformIO $ BS.unsafeUseAsCString bs $ \ptr -> peek (castPtr ptr) -- forM for zipWith for2M_ as bs f = zipWithM_ f as bs --------------------------------------------------------------------- -- BINARY SERIALISATION -- We can't use the Data.ByteString builder as that doesn't track the size of the chunk. data Builder = Builder {-# UNPACK #-} !Int (forall a . 
Ptr a -> Int -> IO ()) sizeBuilder :: Builder -> Int sizeBuilder (Builder i _) = i runBuilder :: Builder -> BS.ByteString runBuilder (Builder i f) = unsafePerformIO $ BS.create i $ \ptr -> f ptr 0 instance Semigroup Builder where (Builder x1 x2) <> (Builder y1 y2) = Builder (x1+y1) $ \p i -> do x2 p i; y2 p $ i+x1 instance Monoid Builder where mempty = Builder 0 $ \_ _ -> pure () mappend = (<>) -- | Methods for Binary serialisation that go directly between strict ByteString values. -- When the Database is read each key/value will be loaded as a separate ByteString, -- and for certain types (e.g. file rules) this may remain the preferred format for storing keys. -- Optimised for performance. class BinaryEx a where putEx :: a -> Builder getEx :: BS.ByteString -> a instance BinaryEx BS.ByteString where putEx x = Builder n $ \ptr i -> BS.unsafeUseAsCString x $ \bs -> copyBytes (ptr `plusPtr` i) (castPtr bs) (fromIntegral n) where n = BS.length x getEx = id instance BinaryEx LBS.ByteString where putEx x = Builder (fromIntegral $ LBS.length x) $ \ptr i -> do let go _ [] = pure () go i (x:xs) = do let n = BS.length x BS.unsafeUseAsCString x $ \bs -> copyBytes (ptr `plusPtr` i) (castPtr bs) (fromIntegral n) go (i+n) xs go i $ LBS.toChunks x getEx = LBS.fromChunks . pure instance BinaryEx [BS.ByteString] where -- Format: -- n :: Word32 - number of strings -- ns :: [Word32]{n} - length of each string -- contents of each string concatenated (sum ns bytes) putEx xs = Builder (4 + (n * 4) + sum ns) $ \p i -> do pokeByteOff p i (fromIntegral n :: Word32) for2M_ [4+i,8+i..] 
ns $ \i x -> pokeByteOff p i (fromIntegral x :: Word32) p<- pure $ p `plusPtr` (i + 4 + (n * 4)) for2M_ (scanl (+) 0 ns) xs $ \i x -> BS.unsafeUseAsCStringLen x $ \(bs, n) -> copyBytes (p `plusPtr` i) (castPtr bs) (fromIntegral n) where ns = map BS.length xs n = length ns getEx bs = unsafePerformIO $ BS.unsafeUseAsCString bs $ \p -> do n <- fromIntegral <$> (peekByteOff p 0 :: IO Word32) ns :: [Word32] <- forM [1..fromIntegral n] $ \i -> peekByteOff p (i * 4) pure $ snd $ mapAccumL (\bs i -> swap $ BS.splitAt (fromIntegral i) bs) (BS.drop (4 + (n * 4)) bs) ns instance BinaryEx () where putEx () = mempty getEx _ = () instance BinaryEx String where putEx = putEx . UTF8.fromString getEx = UTF8.toString instance BinaryEx (Maybe String) where putEx Nothing = mempty putEx (Just xs) = putEx $ UTF8.fromString $ '\0' : xs getEx = fmap snd . uncons . UTF8.toString instance BinaryEx [String] where putEx = putEx . map UTF8.fromString getEx = map UTF8.toString . getEx instance BinaryEx (String, [String]) where putEx (a,bs) = putEx $ a:bs getEx x = let a:bs = getEx x in (a,bs) instance BinaryEx Bool where putEx False = Builder 1 $ \ptr i -> pokeByteOff ptr i (0 :: Word8) putEx True = mempty getEx = BS.null instance BinaryEx Word8 where putEx = putExStorable getEx = getExStorable instance BinaryEx Word16 where putEx = putExStorable getEx = getExStorable instance BinaryEx Word32 where putEx = putExStorable getEx = getExStorable instance BinaryEx Int where putEx = putExStorable getEx = getExStorable instance BinaryEx Float where putEx = putExStorable getEx = getExStorable putExStorable :: forall a . Storable a => a -> Builder putExStorable x = Builder (sizeOf x) $ \p i -> pokeByteOff p i x getExStorable :: forall a . Storable a => BS.ByteString -> a getExStorable = \bs -> unsafePerformIO $ BS.unsafeUseAsCStringLen bs $ \(p, size) -> if size /= n then error "size mismatch" else peek (castPtr p) where n = sizeOf (undefined :: a) putExStorableList :: forall a . 
Storable a => [a] -> Builder putExStorableList xs = Builder (n * length xs) $ \ptr i -> for2M_ [i,i+n..] xs $ \i x -> pokeByteOff ptr i x where n = sizeOf (undefined :: a) getExStorableList :: forall a . Storable a => BS.ByteString -> [a] getExStorableList = \bs -> unsafePerformIO $ BS.unsafeUseAsCStringLen bs $ \(p, size) -> let (d,m) = size `divMod` n in if m /= 0 then error "size mismatch" else forM [0..d-1] $ \i -> peekElemOff (castPtr p) i where n = sizeOf (undefined :: a) -- repeating: -- Word32, length of BS -- BS putExList :: [Builder] -> Builder putExList xs = Builder (sum $ map (\b -> sizeBuilder b + 4) xs) $ \p i -> do let go _ [] = pure () go i (Builder n b:xs) = do pokeByteOff p i (fromIntegral n :: Word32) b p (i+4) go (i+4+n) xs go i xs getExList :: BS.ByteString -> [BS.ByteString] getExList bs | len == 0 = [] | len >= 4 , (n :: Word32, bs) <- unsafeBinarySplit bs , n <- fromIntegral n , (len - 4) >= n = BS.unsafeTake n bs : getExList (BS.unsafeDrop n bs) | otherwise = error "getList, corrupted binary" where len = BS.length bs putExN :: Builder -> Builder putExN (Builder n old) = Builder (n+4) $ \p i -> do pokeByteOff p i (fromIntegral n :: Word32) old p $ i+4 getExN :: BS.ByteString -> (BS.ByteString, BS.ByteString) getExN bs | len >= 4 , (n :: Word32, bs) <- unsafeBinarySplit bs , n <- fromIntegral n , (len - 4) >= n = (BS.unsafeTake n bs, BS.unsafeDrop n bs) | otherwise = error "getList, corrupted binary" where len = BS.length bs shake-0.19.8/src/General/Chunks.hs0000644000000000000000000001225207346545000015032 0ustar0000000000000000{-# LANGUAGE RecordWildCards, ScopedTypeVariables #-} module General.Chunks( Chunks, readChunk, readChunkMax, usingWriteChunks, writeChunk, restoreChunksBackup, usingChunks, resetChunksCompact, resetChunksCorrupt ) where import System.Time.Extra import System.FilePath import Control.Concurrent.Extra import Control.Monad.Extra import Control.Exception import System.IO import System.Directory import qualified 
Data.ByteString as BS import Data.Word import Data.Monoid import General.Binary import General.Extra import General.Cleanup import General.Thread import Prelude data Chunks = Chunks {chunksFileName :: FilePath ,chunksFlush :: Maybe Seconds ,chunksHandle :: MVar Handle } --------------------------------------------------------------------- -- READ/WRITE OPERATIONS readChunk :: Chunks -> IO (Either BS.ByteString BS.ByteString) readChunk c = readChunkMax c maxBound -- | Return either a valid chunk (Right), or a trailing suffix with no information (Left) readChunkMax :: Chunks -> Word32 -> IO (Either BS.ByteString BS.ByteString) readChunkMax Chunks{..} mx = withMVar chunksHandle $ \h -> readChunkDirect h mx readChunkDirect :: Handle -> Word32 -> IO (Either BS.ByteString BS.ByteString) readChunkDirect h mx = do let slop x = do unless (BS.null x) $ hSetFileSize h . subtract (toInteger $ BS.length x) =<< hFileSize h pure $ Left x n <- BS.hGet h 4 if BS.length n < 4 then slop n else do let count = fromIntegral $ min mx $ fst $ unsafeBinarySplit n v <- BS.hGet h count if BS.length v < count then slop (n `BS.append` v) else pure $ Right v writeChunkDirect :: Handle -> Builder -> IO () writeChunkDirect h x = bs `seq` BS.hPut h bs where bs = runBuilder $ putEx (fromIntegral $ sizeBuilder x :: Word32) <> x -- | If 'writeChunks' and any of the reopen operations are interleaved it will cause issues. 
usingWriteChunks :: Cleanup -> Chunks -> IO (Builder -> IO ()) -- We avoid calling flush too often on SSD drives, as that can be slow -- Make sure all exceptions happen on the caller, so we don't have to move exceptions back -- Make sure we only write on one thread, otherwise async exceptions can cause partial writes usingWriteChunks cleanup Chunks{..} = do h <- allocate cleanup (takeMVar chunksHandle) (putMVar chunksHandle) chan <- newChan -- operations to perform on the file kick <- newEmptyMVar -- kicked whenever something is written died <- newBarrier -- has the writing thread finished whenJust chunksFlush $ \flush -> allocateThread cleanup $ forever $ do takeMVar kick sleep flush tryTakeMVar kick writeChan chan $ hFlush h >> pure True -- pump the thread while we are running -- once we abort, let everything finish flushing first -- the mask_ is very important - we don't want to abort until everything finishes allocateThread cleanup $ mask_ $ whileM $ join $ readChan chan -- this cleanup will run before we attempt to kill the thread register cleanup $ writeChan chan $ pure False pure $ \s -> do out <- evaluate $ writeChunkDirect h s -- ensure exceptions occur on this thread writeChan chan $ out >> tryPutMVar kick () >> pure True writeChunk :: Chunks -> Builder -> IO () writeChunk Chunks{..} x = withMVar chunksHandle $ \h -> writeChunkDirect h x --------------------------------------------------------------------- -- FILENAME OPERATIONS backup x = x <.> "backup" restoreChunksBackup :: FilePath -> IO Bool restoreChunksBackup file = do -- complete a partially failed compress b <- doesFileExist $ backup file if not b then pure False else do removeFile_ file renameFile (backup file) file pure True usingChunks :: Cleanup -> FilePath -> Maybe Seconds -> IO Chunks usingChunks cleanup file flush = do h <- newEmptyMVar allocate cleanup (putMVar h =<< openFile file ReadWriteMode) (const $ hClose =<< takeMVar h) pure $ Chunks file flush h -- | The file is being compacted, 
if the process fails, use a backup. resetChunksCompact :: Chunks -> ((Builder -> IO ()) -> IO a) -> IO a resetChunksCompact Chunks{..} act = mask $ \restore -> do h <- takeMVar chunksHandle flip onException (putMVar chunksHandle h) $ restore $ do hClose h copyFile chunksFileName $ backup chunksFileName h <- openFile chunksFileName ReadWriteMode flip finally (putMVar chunksHandle h) $ restore $ do hSetFileSize h 0 hSeek h AbsoluteSeek 0 res <- act $ writeChunkDirect h hFlush h removeFile $ backup chunksFileName pure res -- | The file got corrupted, return a new version. resetChunksCorrupt :: Maybe FilePath -> Chunks -> IO () resetChunksCorrupt copy Chunks{..} = mask $ \restore -> do h <- takeMVar chunksHandle h <- case copy of Nothing -> pure h Just copy -> do flip onException (putMVar chunksHandle h) $ restore $ do hClose h copyFile chunksFileName copy openFile chunksFileName ReadWriteMode flip finally (putMVar chunksHandle h) $ do hSetFileSize h 0 hSeek h AbsoluteSeek 0 shake-0.19.8/src/General/Cleanup.hs0000644000000000000000000000424707346545000015173 0ustar0000000000000000 -- | Code for ensuring cleanup actions are run. module General.Cleanup( Cleanup, newCleanup, withCleanup, register, release, allocate, unprotect ) where import Control.Exception import qualified Data.HashMap.Strict as Map import Data.IORef import Data.List.Extra import Data.Maybe data S = S {unique :: {-# UNPACK #-} !Int -- next index to be used to items ,items :: !(Map.HashMap Int (IO ())) } newtype Cleanup = Cleanup (IORef S) data ReleaseKey = ReleaseKey (IORef S) {-# UNPACK #-} !Int -- | Run with some cleanup scope. Regardless of exceptions/threads, all 'register' actions -- will be run by the time it exits. -- The 'register' actions will be run in reverse order, i.e. the last to be added will be run first. 
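The reverse-order guarantee can be seen in a small usage sketch; note that General.Cleanup is internal to Shake, so this only compiles inside the shake source tree, and the demo function is hypothetical:

```haskell
import General.Cleanup (withCleanup, register)

-- Cleanups run in reverse registration order: the body runs first,
-- then "second", then "first" (the last registered runs first).
demo :: IO ()
demo = withCleanup $ \cleanup -> do
    _ <- register cleanup $ putStrLn "first"
    _ <- register cleanup $ putStrLn "second"
    putStrLn "body"
```

Because the cleanups run even on exceptions, this pattern replaces nested 'bracket' calls when the number of resources is dynamic.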
withCleanup :: (Cleanup -> IO a) -> IO a withCleanup act = do (c, clean) <- newCleanup act c `finally` clean newCleanup :: IO (Cleanup, IO ()) newCleanup = do ref <- newIORef $ S 0 Map.empty -- important to use uninterruptibleMask_ otherwise in things like allocateThread -- we might end up being interrupted and failing to close down the thread -- e.g. see https://github.com/digital-asset/ghcide/issues/381 -- note that packages like safe-exceptions also use uninterruptibleMask_ let clean = uninterruptibleMask_ $ do items <- atomicModifyIORef' ref $ \s -> (s{items=Map.empty}, items s) mapM_ snd $ sortOn (negate . fst) $ Map.toList items pure (Cleanup ref, clean) register :: Cleanup -> IO () -> IO ReleaseKey register (Cleanup ref) act = atomicModifyIORef' ref $ \s -> let i = unique s in (S (unique s + 1) (Map.insert i act $ items s), ReleaseKey ref i) unprotect :: ReleaseKey -> IO () unprotect (ReleaseKey ref i) = atomicModifyIORef' ref $ \s -> (s{items = Map.delete i $ items s}, ()) release :: ReleaseKey -> IO () release (ReleaseKey ref i) = uninterruptibleMask_ $ do undo <- atomicModifyIORef' ref $ \s -> (s{items = Map.delete i $ items s}, Map.lookup i $ items s) fromMaybe (pure ()) undo allocate :: Cleanup -> IO a -> (a -> IO ()) -> IO a allocate cleanup acquire release = mask_ $ do v <- acquire register cleanup $ release v pure v shake-0.19.8/src/General/EscCodes.hs0000644000000000000000000000525007346545000015267 0ustar0000000000000000{-# LANGUAGE CPP #-} -- | Working with escape sequences module General.EscCodes( Color(..), checkEscCodes, removeEscCodes, escWindowTitle, escCursorUp, escClearLine, escForeground, escNormal ) where import Data.Char import Data.List.Extra import System.IO import System.Environment import System.IO.Unsafe #ifdef mingw32_HOST_OS import Data.Word import Data.Bits import Foreign.Ptr import Foreign.Storable import Foreign.Marshal.Alloc #endif checkEscCodes :: IO Bool checkEscCodes = pure checkEscCodesOnce {-# NOINLINE checkEscCodesOnce 
#-} checkEscCodesOnce :: Bool checkEscCodesOnce = unsafePerformIO $ do hdl <- hIsTerminalDevice stdout env <- maybe False (/= "dumb") <$> lookupEnv "TERM" if hdl && env then pure True else #ifdef mingw32_HOST_OS checkEscCodesWindows #else pure False #endif #ifdef mingw32_HOST_OS #ifdef x86_64_HOST_ARCH #define CALLCONV ccall #else #define CALLCONV stdcall #endif foreign import CALLCONV unsafe "Windows.h GetStdHandle" c_GetStdHandle :: Word32 -> IO (Ptr ()) foreign import CALLCONV unsafe "Windows.h GetConsoleMode" c_GetConsoleModule :: Ptr () -> Ptr Word32 -> IO Bool foreign import CALLCONV unsafe "Windows.h SetConsoleMode" c_SetConsoleMode :: Ptr () -> Word32 -> IO Bool c_STD_OUTPUT_HANDLE = 4294967285 :: Word32 -- (-11) for some reason c_ENABLE_VIRTUAL_TERMINAL_PROCESSING = 0x0004 :: Word32 -- | Try and get the handle attributes, if they are all satisfied, return True. -- If they aren't, try and set it to emulated mode. checkEscCodesWindows :: IO Bool checkEscCodesWindows = do h <- c_GetStdHandle c_STD_OUTPUT_HANDLE -- might return INVALID_HANDLE_VALUE, but then the next step will happily fail mode <- alloca $ \v -> do b <- c_GetConsoleModule h v if b then Just <$> peek v else pure Nothing case mode of Nothing -> pure False Just mode -> do let modeNew = mode .|. c_ENABLE_VIRTUAL_TERMINAL_PROCESSING if mode == modeNew then pure True else do c_SetConsoleMode h modeNew #endif removeEscCodes :: String -> String removeEscCodes ('\ESC':'[':xs) = removeEscCodes $ drop1 $ dropWhile (not .
isAlpha) xs removeEscCodes (x:xs) = x : removeEscCodes xs removeEscCodes [] = [] escWindowTitle :: String -> String escWindowTitle x = "\ESC]0;" ++ x ++ "\BEL" escCursorUp :: Int -> String escCursorUp i = "\ESC[" ++ show i ++ "A" escClearLine :: String escClearLine = "\ESC[K" data Color = Black | Red | Green | Yellow | Blue | Magenta | Cyan | White deriving (Show,Enum) escForeground :: Color -> String escForeground x = "\ESC[" ++ show (30 + fromEnum x) ++ "m" escNormal :: String escNormal = "\ESC[0m" shake-0.19.8/src/General/Extra.hs0000644000000000000000000002402407346545000014662 0ustar0000000000000000{-# LANGUAGE ScopedTypeVariables, ConstraintKinds, GeneralizedNewtypeDeriving, ViewPatterns #-} {-# OPTIONS_GHC -Wno-x-partial -Wno-unrecognised-warning-flags #-} module General.Extra( getProcessorCount, findGcc, whenLeft, randomElem, wrapQuote, showBracket, withs, forNothingM, maximum', maximumBy', unconcat, fastAt, headErr, tailErr, zipExact, zipWithExact, isAsyncException, showDurationSecs, usingLineBuffering, doesFileExist_, doesDirectoryExist_, usingNumCapabilities, removeFile_, createDirectoryRecursive, catchIO, tryIO, handleIO, handleSynchronous, Located, Partial, callStackTop, callStackFull, withFrozenCallStack, callStackFromException, Ver(..), makeVer, QTypeRep(..), NoShow(..) 
) where import Control.Exception.Extra import Data.Char import Data.List.Extra import System.Environment import Development.Shake.FilePath import Control.DeepSeq import General.Cleanup import Data.Typeable import System.IO.Error import System.IO.Extra import System.Time.Extra import System.IO.Unsafe import System.Info.Extra import System.Random import System.Directory import System.Exit import Numeric.Extra import Foreign.Storable import Control.Concurrent.Extra import Data.Maybe import Data.Hashable import Data.Primitive.Array import Control.Monad import Control.Monad.ST import GHC.Conc(getNumProcessors) import GHC.Stack --------------------------------------------------------------------- -- Prelude -- See https://ghc.haskell.org/trac/ghc/ticket/10830 - they broke maximumBy maximumBy' :: (a -> a -> Ordering) -> [a] -> a maximumBy' cmp = foldl1' $ \x y -> if cmp x y == GT then x else y maximum' :: Ord a => [a] -> a maximum' = maximumBy' compare newtype NoShow a = NoShow a instance Show (NoShow a) where show _ = "NoShow" unconcat :: [[a]] -> [b] -> [[b]] unconcat [] _ = [] unconcat (a:as) bs = b1 : unconcat as b2 where (b1,b2) = splitAt (length a) bs headErr :: [a] -> a headErr = head tailErr :: [a] -> [a] tailErr = tail --------------------------------------------------------------------- -- Data.List -- | If a string has any spaces then put quotes around and double up all internal quotes. -- Roughly the inverse of Windows command line parsing. wrapQuote :: String -> String wrapQuote xs | any isSpace xs = "\"" ++ concatMap (\x -> if x == '\"' then "\"\"" else [x]) xs ++ "\"" | otherwise = xs -- | If a string has any spaces then put brackets around it. wrapBracket :: String -> String wrapBracket xs | any isSpace xs = "(" ++ xs ++ ")" | otherwise = xs -- | Alias for @wrapBracket . show@. showBracket :: Show a => a -> String showBracket = wrapBracket . show -- | Version of '!!' which is fast and returns 'Nothing' if the index is not present. 
fastAt :: [a] -> (Int -> Maybe a) fastAt xs = \i -> if i < 0 || i >= n then Nothing else Just $ indexArray arr i where n = length xs arr = runST $ do let n = length xs arr <- newArray n undefined forM_ (zipFrom 0 xs) $ \(i,x) -> writeArray arr i x unsafeFreezeArray arr zipWithExact :: Partial => (a -> b -> c) -> [a] -> [b] -> [c] zipWithExact f = g where g [] [] = [] g (a:as) (b:bs) = f a b : g as bs g _ _ = error "zipWithExact: unequal lengths" zipExact :: Partial => [a] -> [b] -> [(a,b)] zipExact = zipWithExact (,) --------------------------------------------------------------------- -- System.Info {-# NOINLINE getProcessorCount #-} getProcessorCount :: IO Int -- unsafePerformIO so we cache the result and only compute it once getProcessorCount = let res = unsafePerformIO act in pure res where act = if rtsSupportsBoundThreads then fromIntegral <$> getNumProcessors else do env <- lookupEnv "NUMBER_OF_PROCESSORS" case env of Just s | [(i,"")] <- reads s -> pure i _ -> do src <- readFile' "/proc/cpuinfo" `catchIO` \_ -> pure "" pure $! max 1 $ length [() | x <- lines src, "processor" `isPrefixOf` x] -- Can you find a GCC executable? return a Bool, and optionally something to add to $PATH to run it findGcc :: IO (Bool, Maybe FilePath) findGcc = do v <- findExecutable "gcc" case v of Nothing | isWindows -> do ghc <- findExecutable "ghc" case ghc of Just ghc -> do let gcc = takeDirectory (takeDirectory ghc) </> "mingw/bin/gcc.exe" b <- doesFileExist_ gcc pure $ if b then (True, Just $ takeDirectory gcc) else (False, Nothing) _ -> pure (False, Nothing) _ -> pure (isJust v, Nothing) --------------------------------------------------------------------- -- System.Random randomElem :: [a] -> IO a randomElem xs = do when (null xs) $ fail "General.Extra.randomElem called with empty list, can't pick a random element" i <- randomRIO (0, length xs - 1) pure $ xs !!
i --------------------------------------------------------------------- -- System.IO usingLineBuffering :: Cleanup -> IO () usingLineBuffering cleanup = do out <- hGetBuffering stdout err <- hGetBuffering stderr when (out /= LineBuffering || err /= LineBuffering) $ do register cleanup $ hSetBuffering stdout out >> hSetBuffering stderr err hSetBuffering stdout LineBuffering >> hSetBuffering stderr LineBuffering --------------------------------------------------------------------- -- System.Time showDurationSecs :: Seconds -> String showDurationSecs = replace ".00s" "s" . showDuration . intToDouble . round --------------------------------------------------------------------- -- Control.Monad withs :: [(a -> r) -> r] -> ([a] -> r) -> r withs [] act = act [] withs (f:fs) act = f $ \a -> withs fs $ \as -> act $ a:as forNothingM :: Monad m => [a] -> (a -> m (Maybe b)) -> m (Maybe [b]) forNothingM [] f = pure $ Just [] forNothingM (x:xs) f = do v <- f x case v of Nothing -> pure Nothing Just v -> liftM (v:) `liftM` forNothingM xs f --------------------------------------------------------------------- -- Control.Concurrent usingNumCapabilities :: Cleanup -> Int -> IO () usingNumCapabilities cleanup new = when rtsSupportsBoundThreads $ do old <- getNumCapabilities when (old /= new) $ do register cleanup $ setNumCapabilities old setNumCapabilities new --------------------------------------------------------------------- -- Control.Exception -- | Is the exception asynchronous, not a "coding error" that should be ignored isAsyncException :: SomeException -> Bool isAsyncException e | Just (_ :: AsyncException) <- fromException e = True | Just (_ :: ExitCode) <- fromException e = True | otherwise = False catchIO :: IO a -> (IOException -> IO a) -> IO a catchIO = catch tryIO :: IO a -> IO (Either IOException a) tryIO = try handleIO :: (IOException -> IO a) -> IO a -> IO a handleIO = handle handleSynchronous :: (SomeException -> IO a) -> IO a -> IO a handleSynchronous = handleBool 
(not . isAsyncException) --------------------------------------------------------------------- -- System.Directory doesFileExist_ :: FilePath -> IO Bool doesFileExist_ x = doesFileExist x `catchIO` \_ -> pure False doesDirectoryExist_ :: FilePath -> IO Bool doesDirectoryExist_ x = doesDirectoryExist x `catchIO` \_ -> pure False -- | Remove a file, but don't worry if it fails removeFile_ :: FilePath -> IO () removeFile_ x = removeFile x `catchIO` \e -> when (isPermissionError e) $ handleIO (\_ -> pure ()) $ do perms <- getPermissions x setPermissions x perms{readable = True, searchable = True, writable = True} removeFile x -- | Like @createDirectoryIfMissing True@ but faster, as it avoids -- any work in the common case the directory already exists. createDirectoryRecursive :: FilePath -> IO () createDirectoryRecursive dir = do x <- tryIO $ doesDirectoryExist dir when (x /= Right True) $ createDirectoryIfMissing True dir --------------------------------------------------------------------- -- Data.Either whenLeft :: Applicative m => Either a b -> (a -> m ()) -> m () whenLeft x f = either f (const $ pure ()) x --------------------------------------------------------------------- -- Data.CallStack type Located = Partial callStackTop :: Partial => String callStackTop = withFrozenCallStack $ headDef "unknown location" callStackFull callStackFull :: Partial => [String] callStackFromException :: SomeException -> ([String], SomeException) -- | Invert 'prettyCallStack', which GHC pre-applies in certain cases parseCallStack = reverse . map trimStart . drop1 . lines callStackFull = parseCallStack $ prettyCallStack $ popCallStack callStack callStackFromException (fromException -> Just (ErrorCallWithLocation msg loc)) = (parseCallStack loc, toException $ ErrorCall msg) callStackFromException e = ([], e) --------------------------------------------------------------------- -- Data.Version -- | A version number that indicates change, not ordering or compatibility. 
-- Always presented as an 'Int' to the user, but a newtype inside the library for safety. newtype Ver = Ver Int deriving (Show,Eq,Storable) makeVer :: String -> Ver makeVer = Ver . hash --------------------------------------------------------------------- -- Data.Typeable -- | Like TypeRep, but the Show includes enough information to be unique -- so I can rely on @a == b === show a == show b@. newtype QTypeRep = QTypeRep {fromQTypeRep :: TypeRep} deriving (Eq,Hashable,NFData) instance Show QTypeRep where -- Need to show enough so that different types with the same names don't clash -- But can't show too much or the history is not portable https://github.com/ndmitchell/shake/issues/670 show (QTypeRep x) = f x where f x = ['(' | xs /= []] ++ (unwords $ g c : map f xs) ++ [')' | xs /= []] where (c, xs) = splitTyConApp x g x = tyConModule x ++ "." ++ tyConName x shake-0.19.8/src/General/Fence.hs0000644000000000000000000000361307346545000014620 0ustar0000000000000000{-# LANGUAGE LambdaCase #-} module General.Fence( Fence, newFence, signalFence, waitFence, testFence, exceptFence ) where import Control.Monad import Control.Monad.IO.Class import Control.Exception.Extra import Development.Shake.Internal.Errors import Data.Maybe import Data.Either.Extra import Data.IORef --------------------------------------------------------------------- -- FENCE -- | Like a barrier, but based on callbacks newtype Fence m a = Fence (IORef (Either (a -> m ()) a)) instance Show (Fence m a) where show _ = "Fence" newFence :: MonadIO m => IO (Fence m a) newFence = Fence <$> newIORef (Left $ const $ pure ()) signalFence :: (Partial, MonadIO m) => Fence m a -> a -> m () signalFence (Fence ref) v = join $ liftIO $ atomicModifyIORef' ref $ \case Left queue -> (Right v, queue v) Right _ -> throwImpure $ errorInternal "signalFence called twice on one Fence" waitFence :: MonadIO m => Fence m a -> (a -> m ()) -> m () waitFence (Fence ref) call = join $ liftIO $ atomicModifyIORef' ref $ \case Left 
queue -> (Left (\a -> queue a >> call a), pure ()) Right v -> (Right v, call v) testFence :: Fence m a -> IO (Maybe a) testFence (Fence x) = eitherToMaybe <$> readIORef x --------------------------------------------------------------------- -- FENCE COMPOSITES exceptFence :: MonadIO m => [Fence m (Either e r)] -> m (Fence m (Either e [r])) exceptFence xs = do -- number of items still to complete, becomes negative after it has triggered todo <- liftIO $ newIORef $ length xs fence <- liftIO newFence forM_ xs $ \x -> waitFence x $ \res -> join $ liftIO $ atomicModifyIORef' todo $ \i -> case res of Left e | i >= 0 -> (-1, signalFence fence $ Left e) _ | i == 1 -> (-1, signalFence fence . Right =<< liftIO (mapM (fmap (fromRight' . fromJust) . testFence) xs)) | otherwise -> (i-1, pure ()) pure fence shake-0.19.8/src/General/FileLock.hs0000644000000000000000000000511507346545000015267 0ustar0000000000000000{-# LANGUAGE CPP #-} module General.FileLock(usingLockFile) where import Control.Exception.Extra import System.FilePath import General.Extra import General.Cleanup #ifdef mingw32_HOST_OS import Control.Monad import Data.Bits import Data.Word import Foreign.Ptr import Foreign.C.String #else import System.IO import System.Posix.IO #endif #ifdef mingw32_HOST_OS #ifdef x86_64_HOST_ARCH #define CALLCONV ccall #else #define CALLCONV stdcall #endif foreign import CALLCONV unsafe "Windows.h CreateFileW" c_CreateFileW :: CWString -> Word32 -> Word32 -> Ptr () -> Word32 -> Word32 -> Ptr () -> IO (Ptr ()) foreign import CALLCONV unsafe "Windows.h CloseHandle" c_CloseHandle :: Ptr () -> IO Bool foreign import CALLCONV unsafe "Windows.h GetLastError" c_GetLastError :: IO Word32 c_GENERIC_WRITE = 0x40000000 :: Word32 c_GENERIC_READ = 0x80000000 :: Word32 c_FILE_SHARE_NONE = 0 :: Word32 c_OPEN_ALWAYS = 4 :: Word32 c_FILE_ATTRIBUTE_NORMAL = 0x80 :: Word32 c_INVALID_HANDLE_VALUE = intPtrToPtr (-1) c_ERROR_SHARING_VIOLATION = 32 #endif usingLockFile :: Cleanup -> FilePath -> IO () #ifdef 
mingw32_HOST_OS usingLockFile b file = do createDirectoryRecursive $ takeDirectory file let open = withCWString file $ \cfile -> c_CreateFileW cfile (c_GENERIC_READ .|. c_GENERIC_WRITE) c_FILE_SHARE_NONE nullPtr c_OPEN_ALWAYS c_FILE_ATTRIBUTE_NORMAL nullPtr h <- allocate b open (void . c_CloseHandle) when (h == c_INVALID_HANDLE_VALUE) $ do err <- c_GetLastError errorIO $ "Shake failed to acquire a file lock on " ++ file ++ "\n" ++ (if err == c_ERROR_SHARING_VIOLATION then "ERROR_SHARING_VIOLATION - Shake is probably already running." else "Code " ++ show err ++ ", unknown reason for failure.") #else usingLockFile cleanup file = do createDirectoryRecursive $ takeDirectory file tryIO $ writeFile file "" fd <- allocate cleanup (openSimpleFd file ReadWrite) closeFd let lock = (WriteLock, AbsoluteSeek, 0, 0) setLock fd lock `catchIO` \e -> do res <- getLock fd lock errorIO $ "Shake failed to acquire a file lock on " ++ file ++ "\n" ++ (case res of Nothing -> "" Just (pid, _) -> "Shake process ID " ++ show pid ++ " is using this lock.\n") ++ show e #ifndef MIN_VERSION_unix #define MIN_VERSION_unix(a,b,c) 0 #endif #if MIN_VERSION_unix(2,8,0) openSimpleFd file mode = openFd file mode defaultFileFlags #else openSimpleFd file mode = openFd file mode Nothing defaultFileFlags #endif #endif shake-0.19.8/src/General/GetOpt.hs0000644000000000000000000000471507346545000015006 0ustar0000000000000000 module General.GetOpt( OptDescr(..), ArgDescr(..), getOpt, fmapFmapOptDescr, showOptDescr, mergeOptDescr, removeOverlap, optionsEnum, optionsEnumDesc ) where import qualified System.Console.GetOpt as O import System.Console.GetOpt hiding (getOpt) import qualified Data.HashSet as Set import Data.Maybe import Data.Either import Data.List.Extra getOpt :: [OptDescr (Either String a)] -> [String] -> ([a], [String], [String]) getOpt opts args = (flagGood, files, flagBad ++ errs) where (flags, files, errs) = O.getOpt O.Permute opts args (flagBad, flagGood) = partitionEithers flags 
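The `General.GetOpt.getOpt` wrapper above parses each flag to `Either String a`, so per-flag validation errors end up in the same error list as GetOpt's own parse errors. A standalone sketch of the same idea, using only base's `System.Console.GetOpt`; the `--jobs` flag here is invented for illustration and is not a real Shake option:

```haskell
-- Sketch of the Either-based flag validation used by General.GetOpt.
-- Only the getOpt wrapper mirrors the module above; --jobs is hypothetical.
import qualified System.Console.GetOpt as O
import System.Console.GetOpt hiding (getOpt)
import Data.Either (partitionEithers)

-- Flags parse to Either String a; flag-level errors join GetOpt's errors
getOpt :: [OptDescr (Either String a)] -> [String] -> ([a], [String], [String])
getOpt opts args = (flagGood, files, flagBad ++ errs)
    where (flags, files, errs) = O.getOpt O.Permute opts args
          (flagBad, flagGood) = partitionEithers flags

-- A hypothetical flag that validates its argument at parse time
jobs :: OptDescr (Either String Int)
jobs = Option "j" ["jobs"] (ReqArg parse "N") "Number of jobs."
    where parse s = case reads s of
              [(n, "")] | n > 0 -> Right n
              _ -> Left $ "--jobs expects a positive number, got " ++ show s

main :: IO ()
main = do
    print $ getOpt [jobs] ["-j4", "input.txt"]  -- good flag, one file
    print $ getOpt [jobs] ["-j0"]               -- validation error surfaces
```

The payoff of the `Either` encoding is that a flag can reject a malformed argument without aborting the whole parse: all errors are collected and reported together.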
fmapFmapOptDescr :: (a -> b) -> OptDescr (Either String a) -> OptDescr (Either String b) fmapFmapOptDescr f = fmap (fmap f) showOptDescr :: [OptDescr a] -> [String] showOptDescr xs = concat [ if nargs <= 26 then [" " ++ args ++ replicate (28 - nargs) ' ' ++ desc] else [" " ++ args, replicate 30 ' ' ++ desc] | Option s l arg desc <- xs , let args = intercalate ", " $ map (short arg) s ++ map (long arg) l , let nargs = length args] where short NoArg{} x = "-" ++ [x] short (ReqArg _ b) x = "-" ++ [x] ++ " " ++ b short (OptArg _ b) x = "-" ++ [x] ++ "[" ++ b ++ "]" long NoArg{} x = "--" ++ x long (ReqArg _ b) x = "--" ++ x ++ "=" ++ b long (OptArg _ b) x = "--" ++ x ++ "[=" ++ b ++ "]" -- | Remove flags from the first field that are present in the second removeOverlap :: [OptDescr b] -> [OptDescr a] -> [OptDescr a] removeOverlap bad = mapMaybe f where short = Set.fromList $ concat [x | Option x _ _ _ <- bad] long = Set.fromList $ concat [x | Option _ x _ _ <- bad] f (Option a b c d) | null a2 && null b2 = Nothing | otherwise = Just $ Option a2 b2 c d where a2 = filter (not . flip Set.member short) a b2 = filter (not . 
flip Set.member long) b mergeOptDescr :: [OptDescr (Either String a)] -> [OptDescr (Either String b)] -> [OptDescr (Either String (Either a b))] mergeOptDescr xs ys = map (fmapFmapOptDescr Left) xs ++ map (fmapFmapOptDescr Right) ys optionsEnum :: (Enum a, Bounded a, Show a) => [OptDescr (Either String a)] optionsEnum = optionsEnumDesc [(x, "Flag " ++ lower (show x) ++ ".") | x <- enumerate] optionsEnumDesc :: Show a => [(a, String)] -> [OptDescr (Either String a)] optionsEnumDesc xs = [Option "" [lower $ show x] (NoArg $ Right x) d | (x,d) <- xs] shake-0.19.8/src/General/Ids.hs0000644000000000000000000001036007346545000014314 0ustar0000000000000000{-# LANGUAGE RecordWildCards, BangPatterns, GADTs, UnboxedTuples #-} -- Note that argument order is more like IORef than Map, because its mutable module General.Ids( Ids, Id(..), empty, insert, lookup, fromList, null, size, sizeUpperBound, forWithKeyM_, forCopy, forMutate, toList, elems, toMap ) where import Data.IORef.Extra import Data.Primitive.Array hiding (fromList) import Control.Exception import General.Intern(Id(..)) import Control.Monad.Extra import Data.List.Extra(zipFrom) import Data.Maybe import Data.Functor import qualified Data.HashMap.Strict as Map import Prelude hiding (lookup, null) import GHC.IO(IO(..)) import GHC.Exts(RealWorld) newtype Ids a = Ids (IORef (S a)) data S a = S {capacity :: {-# UNPACK #-} !Int -- ^ Number of entries in values, initially 0 ,used :: {-# UNPACK #-} !Int -- ^ Capacity that has been used, assuming no gaps from index 0, initially 0 ,values :: {-# UNPACK #-} !(MutableArray RealWorld (Maybe a)) } empty :: IO (Ids a) empty = do let capacity = 0 let used = 0 values <- newArray capacity Nothing Ids <$> newIORef S{..} fromList :: [a] -> IO (Ids a) fromList xs = do let capacity = length xs let used = capacity values <- newArray capacity Nothing forM_ (zipFrom 0 xs) $ \(i, x) -> writeArray values i $ Just x Ids <$> newIORef S{..} sizeUpperBound :: Ids a -> IO Int sizeUpperBound (Ids 
ref) = do S{..} <- readIORef ref pure used size :: Ids a -> IO Int size (Ids ref) = do S{..} <- readIORef ref let go !acc i | i < 0 = pure acc | otherwise = do v <- readArray values i if isJust v then go (acc+1) (i-1) else go acc (i-1) go 0 (used-1) toMap :: Ids a -> IO (Map.HashMap Id a) toMap ids = do mp <- Map.fromList <$> toListUnsafe ids pure $! mp forWithKeyM_ :: Ids a -> (Id -> a -> IO ()) -> IO () forWithKeyM_ (Ids ref) f = do S{..} <- readIORef ref let go !i | i >= used = pure () | otherwise = do v <- readArray values i whenJust v $ f $ Id $ fromIntegral i go $ i+1 go 0 forCopy :: Ids a -> (a -> b) -> IO (Ids b) forCopy (Ids ref) f = do S{..} <- readIORef ref values2 <- newArray capacity Nothing let go !i | i >= used = pure () | otherwise = do v <- readArray values i whenJust v $ \v -> writeArray values2 i $ Just $ f v go $ i+1 go 0 Ids <$> newIORef (S capacity used values2) forMutate :: Ids a -> (a -> a) -> IO () forMutate (Ids ref) f = do S{..} <- readIORef ref let go !i | i >= used = pure () | otherwise = do v <- readArray values i whenJust v $ \v -> writeArray values i $! Just $! 
f v go $ i+1 go 0 toListUnsafe :: Ids a -> IO [(Id, a)] toListUnsafe (Ids ref) = do S{..} <- readIORef ref -- execute in O(1) stack -- see https://neilmitchell.blogspot.co.uk/2015/09/making-sequencemapm-for-io-take-o1-stack.html let index _ i | i >= used = [] index r i | IO io <- readArray values i = case io r of (# r, Nothing #) -> index r (i+1) (# r, Just v #) -> (Id $ fromIntegral i, v) : index r (i+1) IO $ \r -> (# r, index r 0 #) toList :: Ids a -> IO [(Id, a)] toList ids = do xs <- toListUnsafe ids let demand (_:xs) = demand xs demand [] = () evaluate $ demand xs pure xs elems :: Ids a -> IO [a] elems ids = map snd <$> toList ids null :: Ids a -> IO Bool null ids = (== 0) <$> sizeUpperBound ids insert :: Ids a -> Id -> a -> IO () insert (Ids ref) (Id i) v = do S{..} <- readIORef ref let ii = fromIntegral i if ii < capacity then do writeArray values ii $ Just v when (ii >= used) $ writeIORef' ref S{used=ii+1,..} else do c2<- pure $ max (capacity * 2) (ii + 10000) v2 <- newArray c2 Nothing copyMutableArray v2 0 values 0 capacity writeArray v2 ii $ Just v writeIORef' ref $ S c2 (ii+1) v2 lookup :: Ids a -> Id -> IO (Maybe a) lookup (Ids ref) (Id i) = do S{..} <- readIORef ref let ii = fromIntegral i if ii < used then readArray values ii else pure Nothing shake-0.19.8/src/General/Intern.hs0000644000000000000000000000223107346545000015032 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving #-} module General.Intern( Intern, Id(..), empty, insert, add, lookup, toList, fromList ) where import Development.Shake.Classes import Foreign.Storable import Data.Word import Prelude hiding (lookup) import qualified Data.HashMap.Strict as Map import Data.List(foldl') -- Invariant: The first field is the highest value in the Map data Intern a = Intern {-# UNPACK #-} !Word32 !(Map.HashMap a Id) newtype Id = Id Word32 deriving (Eq,Hashable,Ord,Binary,Show,NFData,Storable) empty :: Intern a empty = Intern 0 Map.empty insert :: (Eq a, Hashable a) => a -> Id -> Intern a -> 
Intern a insert k v@(Id i) (Intern n mp) = Intern (max n i) $ Map.insert k v mp add :: (Eq a, Hashable a) => a -> Intern a -> (Intern a, Id) add k (Intern v mp) = (Intern v2 $ Map.insert k (Id v2) mp, Id v2) where v2 = v + 1 lookup :: (Eq a, Hashable a) => a -> Intern a -> Maybe Id lookup k (Intern _ mp) = Map.lookup k mp toList :: Intern a -> [(a, Id)] toList (Intern _ mp) = Map.toList mp fromList :: (Eq a, Hashable a) => [(a, Id)] -> Intern a fromList xs = Intern (foldl' max 0 [i | (_, Id i) <- xs]) (Map.fromList xs) shake-0.19.8/src/General/ListBuilder.hs0000644000000000000000000000233307346545000016020 0ustar0000000000000000{-# LANGUAGE DeriveFunctor, GeneralizedNewtypeDeriving #-} module General.ListBuilder( ListBuilder, runListBuilder, newListBuilder, Tree(..), flattenTree, unflattenTree ) where import Data.Semigroup import Prelude -- ListBuilder is opaque outside this module newtype ListBuilder a = ListBuilder (Tree a) deriving (Semigroup, Monoid, Functor) data Tree a = Empty | Leaf a | Branch (Tree a) (Tree a) deriving (Functor,Eq,Ord,Show) instance Semigroup (Tree a) where Empty <> x = x x <> Empty = x x <> y = Branch x y instance Monoid (Tree a) where mempty = Empty mappend = (<>) flattenTree :: Tree a -> [a] flattenTree x = f x [] where f Empty acc = acc f (Leaf x) acc = x : acc f (Branch x y) acc = f x (f y acc) unflattenTree :: Tree a -> [b] -> Tree b unflattenTree t xs = fst $ f t xs where f Empty xs = (Empty, xs) f Leaf{} (x:xs) = (Leaf x, xs) f (Branch a b) xs = (Branch a2 b2, xs3) where (a2, xs2) = f a xs (b2, xs3) = f b xs2 newListBuilder :: a -> ListBuilder a newListBuilder = ListBuilder . 
Leaf runListBuilder :: ListBuilder a -> [a] runListBuilder (ListBuilder x) = flattenTree x shake-0.19.8/src/General/Makefile.hs0000644000000000000000000000254007346545000015313 0ustar0000000000000000 module General.Makefile(parseMakefile) where import qualified Data.ByteString.Char8 as BS import Data.Char endsSlash :: BS.ByteString -> Bool endsSlash = BS.isSuffixOf (BS.singleton '\\') wordsMakefile :: BS.ByteString -> [BS.ByteString] wordsMakefile = f . BS.splitWith isSpace where f (x:xs) | BS.null x = f xs f (x:y:xs) | endsSlash x = f $ BS.concat [BS.init x, BS.singleton ' ', y] : xs f (x:xs) = x : f xs f [] = [] parseMakefile :: BS.ByteString -> [(BS.ByteString, [BS.ByteString])] parseMakefile = concatMap f . join . linesCR where join xs = case span endsSlash xs of ([], []) -> [] (xs, []) -> [BS.unwords $ map BS.init xs] ([], y:ys) -> y : join ys (xs, y:ys) -> BS.unwords (map BS.init xs ++ [y]) : join ys f x = [(a, wordsMakefile $ BS.drop 1 b) | a <- wordsMakefile a] where (a,b) = BS.break (== ':') $ BS.takeWhile (/= '#') x -- | This is a hot-spot, so optimised linesCR :: BS.ByteString -> [BS.ByteString] linesCR x = case BS.split '\n' x of x:xs | Just ('\r',x) <- unsnoc x -> x : map (\x -> case unsnoc x of Just ('\r',x) -> x; _ -> x) xs xs -> xs where -- the ByteString unsnoc was introduced in a newer version unsnoc x | BS.null x = Nothing | otherwise = Just (BS.last x, BS.init x) shake-0.19.8/src/General/Pool.hs0000644000000000000000000001534707346545000014520 0ustar0000000000000000{-# LANGUAGE TupleSections #-} -- | Thread pool implementation. The three names correspond to the following -- priority levels (highest to lowest): -- -- * 'addPoolException' - things that probably result in a build error, -- so kick them off quickly. -- -- * 'addPoolResume' - things that started, blocked, and may have open -- resources in their closure. -- -- * 'addPoolStart' - rules that haven't yet started. -- -- * 'addPoolBatch' - rules that might batch if other rules start first. 
module General.Pool( Pool, runPool, addPool, PoolPriority(..), increasePool, keepAlivePool ) where import Control.Concurrent.Extra import System.Time.Extra import Control.Exception import Control.Monad.Extra import General.Timing import General.Thread import qualified Data.Heap as Heap import qualified Data.HashSet as Set import Data.IORef.Extra import System.Random --------------------------------------------------------------------- -- THREAD POOL {- Must keep a list of active threads, so can raise exceptions in a timely manner If any worker throws an exception, must signal to all the other workers -} data S = S {alive :: !Bool -- True until there's an exception, after which don't spawn more tasks ,threads :: !(Set.HashSet Thread) -- IMPORTANT: Must be strict or we leak thread stacks ,threadsLimit :: {-# UNPACK #-} !Int -- user supplied thread limit, Set.size threads <= threadsLimit ,threadsCount :: {-# UNPACK #-} !Int -- Set.size threads, but in O(1) ,threadsMax :: {-# UNPACK #-} !Int -- high water mark of Set.size threads (accounting only) ,threadsSum :: {-# UNPACK #-} !Int -- number of threads we have been through (accounting only) ,rand :: IO Int -- operation to give us the next random Int ,todo :: !(Heap.Heap (Heap.Entry (PoolPriority, Int) (IO ()))) -- operations waiting a thread } emptyS :: Int -> Bool -> IO S emptyS n deterministic = do rand <- if not deterministic then pure randomIO else do ref <- newIORef 0 -- no need to be thread-safe - if two threads race they were basically the same time anyway pure $ do i <- readIORef ref; writeIORef' ref (i+1); pure i pure $ S True Set.empty n 0 0 0 rand Heap.empty data Pool = Pool !(Var S) -- Current state, 'alive' = False to say we are aborting !(Barrier (Either SomeException S)) -- Barrier to signal that we are finished withPool :: Pool -> (S -> IO (S, IO ())) -> IO () withPool (Pool var _) f = join $ modifyVar var $ \s -> if alive s then f s else pure (s, pure ()) withPool_ :: Pool -> (S -> IO S) -> IO () 
withPool_ pool act = withPool pool $ fmap (, pure()) . act worker :: Pool -> IO () worker pool = withPool pool $ \s -> pure $ case Heap.uncons $ todo s of Nothing -> (s, pure ()) Just (Heap.Entry _ now, todo2) -> (s{todo = todo2}, now >> worker pool) -- | Given a pool, and a function that breaks the S invariants, restore them. -- They are only allowed to touch threadsLimit or todo. -- Assumes only requires spawning at most one job (e.g. can't increase the pool by more than one at a time) step :: Pool -> (S -> IO S) -> IO () -- mask_ is so we don't spawn and not record it step pool@(Pool _ done) op = mask_ $ withPool_ pool $ \s -> do s <- op s case Heap.uncons $ todo s of Just (Heap.Entry _ now, todo2) | threadsCount s < threadsLimit s -> do -- spawn a new worker t <- newThreadFinally (now >> worker pool) $ \t res -> case res of Left e -> withPool_ pool $ \s -> do signalBarrier done $ Left e pure (remThread t s){alive = False} Right _ -> step pool $ pure . remThread t pure (addThread t s){todo = todo2} Nothing | threadsCount s == 0 -> do signalBarrier done $ Right s pure s{alive = False} _ -> pure s where addThread t s = s{threads = Set.insert t $ threads s, threadsCount = threadsCount s + 1 ,threadsSum = threadsSum s + 1, threadsMax = threadsMax s `max` (threadsCount s + 1)} remThread t s = s{threads = Set.delete t $ threads s, threadsCount = threadsCount s - 1} -- | Add a new task to the pool. See the top of the module for the relative ordering -- and semantics. addPool :: PoolPriority -> Pool -> IO a -> IO () addPool priority pool act = step pool $ \s -> do i <- rand s pure s{todo = Heap.insert (Heap.Entry (priority, i) $ void act) $ todo s} data PoolPriority = PoolException | PoolResume | PoolStart | PoolBatch | PoolDeprioritize Double deriving (Eq,Ord) -- | Temporarily increase the pool by 1 thread. Call the cleanup action to restore the value. -- After calling cleanup you should requeue onto a new thread.
increasePool :: Pool -> IO (IO ())
increasePool pool = do
    step pool $ \s -> pure s{threadsLimit = threadsLimit s + 1}
    pure $ step pool $ \s -> pure s{threadsLimit = threadsLimit s - 1}

-- | Make sure the pool cannot run out of tasks (and thus everything finishes) until after the cancel is called.
--   Ensures that a pool that will requeue in time doesn't go idle.
keepAlivePool :: Pool -> IO (IO ())
keepAlivePool pool = do
    bar <- newBarrier
    addPool PoolResume pool $ do
        cancel <- increasePool pool
        waitBarrier bar
        cancel
    pure $ signalBarrier bar ()

-- | Run all the tasks in the pool on the given number of workers.
--   If any thread throws an exception, the exception will be reraised.
runPool :: Bool -> Int -> (Pool -> IO ()) -> IO () -- run all tasks in the pool
runPool deterministic n act = do
    s <- newVar =<< emptyS n deterministic
    done <- newBarrier
    let pool = Pool s done

    -- if someone kills our thread, make sure we kill our child threads
    let cleanup = join $ modifyVar s $ \s -> pure (s{alive=False}, stopThreads $ Set.toList $ threads s)

    let ghc10793 = do
            -- if this thread dies because it is blocked on an MVar there's a chance we have
            -- a better error in the done barrier, and GHC raised the exception wrongly, see:
            -- https://ghc.haskell.org/trac/ghc/ticket/10793
            sleep 1 -- give it a little bit of time for the finally to run
            -- no big deal, since the blocked indefinitely takes a while to fire anyway
            res <- waitBarrierMaybe done
            case res of
                Just (Left e) -> throwIO e
                _ -> throwIO BlockedIndefinitelyOnMVar
    flip finally cleanup $ handle (\BlockedIndefinitelyOnMVar -> ghc10793) $ do
        addPool PoolStart pool $ act pool
        res <- waitBarrier done
        case res of
            Left e -> throwIO e
            Right s -> addTiming $ "Pool finished (" ++ show (threadsSum s) ++ " threads, " ++ show (threadsMax s) ++ " max)"
shake-0.19.8/src/General/Process.hs0000644000000000000000000002213107346545000015212 0ustar0000000000000000{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE RecordWildCards #-}

-- | A wrapping of
createProcess to provide a more flexible interface. module General.Process( Buffer, newBuffer, readBuffer, process, ProcessOpts(..), Source(..), Destination(..) ) where import Control.Concurrent.Extra import Control.DeepSeq import Control.Exception.Extra as C import Control.Monad.Extra import Data.List.Extra import Data.Maybe import Foreign.C.Error import System.Exit import System.IO.Extra import System.Info.Extra import System.Process import System.Time.Extra import Data.Unique import Data.IORef.Extra import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Lazy as LBS import General.Extra import Development.Shake.Internal.Errors import GHC.IO.Exception (IOErrorType(..), IOException(..)) --------------------------------------------------------------------- -- BUFFER ABSTRACTION data Buffer a = Buffer Unique (IORef [a]) instance Eq (Buffer a) where Buffer x _ == Buffer y _ = x == y instance Ord (Buffer a) where compare (Buffer x _) (Buffer y _) = compare x y newBuffer :: IO (Buffer a) newBuffer = liftM2 Buffer newUnique (newIORef []) addBuffer :: Buffer a -> a -> IO () addBuffer (Buffer _ ref) x = atomicModifyIORef_ ref (x:) readBuffer :: Buffer a -> IO [a] readBuffer (Buffer _ ref) = reverse <$> readIORef ref --------------------------------------------------------------------- -- OPTIONS data Source = SrcFile FilePath | SrcString String | SrcBytes LBS.ByteString | SrcInherit data Destination = DestEcho | DestFile FilePath | DestString (Buffer String) | DestBytes (Buffer BS.ByteString) deriving (Eq,Ord) isDestString DestString{} = True; isDestString _ = False isDestBytes DestBytes{} = True; isDestBytes _ = False data ProcessOpts = ProcessOpts {poCommand :: CmdSpec ,poCwd :: Maybe FilePath ,poEnv :: Maybe [(String, String)] ,poTimeout :: Maybe Double ,poStdin :: [Source] ,poStdout :: [Destination] ,poStderr :: [Destination] ,poAsync :: Bool ,poCloseFds :: Bool ,poGroup :: Bool } 
--------------------------------------------------------------------- -- IMPLEMENTATION -- | If two buffers can be replaced by one and a copy, do that (only if they start empty) optimiseBuffers :: ProcessOpts -> IO (ProcessOpts, IO ()) optimiseBuffers po@ProcessOpts{..} = pure (po{poStdout = nubOrd poStdout, poStderr = nubOrd poStderr}, pure ()) stdStream :: (FilePath -> Handle) -> [Destination] -> [Destination] -> StdStream stdStream _ [DestEcho] _ = Inherit stdStream file [DestFile x] other | other == [DestFile x] || DestFile x `notElem` other = UseHandle $ file x stdStream _ _ _ = CreatePipe stdIn :: (FilePath -> Handle) -> [Source] -> (StdStream, Handle -> IO ()) stdIn _ [SrcInherit] = (Inherit, const $ pure ()) stdIn file [SrcFile x] = (UseHandle $ file x, const $ pure ()) stdIn file src = (,) CreatePipe $ \h -> ignoreSigPipe $ do forM_ src $ \case SrcString x -> hPutStr h x SrcBytes x -> LBS.hPutStr h x SrcFile x -> LBS.hPutStr h =<< LBS.hGetContents (file x) SrcInherit -> pure () -- Can't both inherit and set it hClose h ignoreSigPipe :: IO () -> IO () ignoreSigPipe = handleIO $ \e -> case e of IOError {ioe_type=ResourceVanished, ioe_errno=Just ioe} | Errno ioe == ePIPE -> pure () _ -> throwIO e withExceptions :: IO () -> IO a -> IO a withExceptions stop go = do bar <- newBarrier v <- mask $ \unmask -> do forkFinally (unmask go) $ signalBarrier bar unmask (waitBarrier bar) `onException` do forkIO stop waitBarrier bar either throwIO pure v withTimeout :: Maybe Double -> IO () -> IO a -> IO a withTimeout Nothing _ go = go withTimeout (Just s) stop go = bracket (forkIO $ sleep s >> stop) killThread $ const go cmdSpec :: CmdSpec -> CreateProcess cmdSpec (ShellCommand x) = shell x cmdSpec (RawCommand x xs) = proc x xs forkWait :: IO a -> IO (IO a) forkWait a = do res <- newEmptyMVar _ <- mask $ \restore -> forkIO $ try_ (restore a) >>= putMVar res pure $ takeMVar res >>= either throwIO pure abort :: Bool -> ProcessHandle -> IO () abort poGroup pid = do when 
poGroup $ do interruptProcessGroupOf pid sleep 3 -- give the process a few seconds grace period to die nicely terminateProcess pid withFiles :: IOMode -> [FilePath] -> ((FilePath -> Handle) -> IO a) -> IO a withFiles mode files act = withs (map (`withFile` mode) files) $ \handles -> act $ \x -> fromJust $ lookup x $ zipExact files handles -- General approach taken from readProcessWithExitCode process :: ProcessOpts -> IO (ProcessHandle, ExitCode) process po = do (ProcessOpts{..}, flushBuffers) <- optimiseBuffers po let outFiles = nubOrd [x | DestFile x <- poStdout ++ poStderr] let inFiles = nubOrd [x | SrcFile x <- poStdin] withFiles WriteMode outFiles $ \outHandle -> withFiles ReadMode inFiles $ \inHandle -> do let cp = (cmdSpec poCommand){cwd = poCwd, env = poEnv, create_group = poGroup, close_fds = poCloseFds ,std_in = fst $ stdIn inHandle poStdin ,std_out = stdStream outHandle poStdout poStderr, std_err = stdStream outHandle poStderr poStdout} withCreateProcessCompat cp $ \inh outh errh pid -> withTimeout poTimeout (abort poGroup pid) $ withExceptions (abort poGroup pid) $ do let streams = [(outh, stdout, poStdout) | Just outh <- [outh], CreatePipe <- [std_out cp]] ++ [(errh, stderr, poStderr) | Just errh <- [errh], CreatePipe <- [std_err cp]] wait <- forM streams $ \(h, hh, dest) -> do -- no point tying the streams together if one is being streamed directly let isTied = not (poStdout `disjoint` poStderr) && length streams == 2 let isBinary = any isDestBytes dest || not (any isDestString dest) when isTied $ hSetBuffering h LineBuffering when (DestEcho `elem` dest) $ do buf <- hGetBuffering hh case buf of BlockBuffering{} -> pure () _ -> hSetBuffering h buf if isBinary then do hSetBinaryMode h True dest<- pure $ flip map dest $ \case DestEcho -> BS.hPut hh DestFile x -> BS.hPut (outHandle x) DestString x -> addBuffer x . (if isWindows then replace "\r\n" "\n" else id) . 
BS.unpack DestBytes x -> addBuffer x forkWait $ whileM $ do src <- BS.hGetSome h 4096 mapM_ ($ src) dest notM $ hIsEOF h else if isTied then do dest<- pure $ flip map dest $ \case DestEcho -> hPutStrLn hh DestFile x -> hPutStrLn (outHandle x) DestString x -> addBuffer x . (++ "\n") DestBytes{} -> throwImpure $ errorInternal "Not reachable due to isBinary condition" forkWait $ whileM $ ifM (hIsEOF h) (pure False) $ do src <- hGetLine h mapM_ ($ src) dest pure True else do src <- hGetContents h wait1 <- forkWait $ C.evaluate $ rnf src waits <- forM dest $ \case DestEcho -> forkWait $ hPutStr hh src DestFile x -> forkWait $ hPutStr (outHandle x) src DestString x -> do addBuffer x src; pure $ pure () DestBytes{} -> throwImpure $ errorInternal "Not reachable due to isBinary condition" pure $ sequence_ $ wait1 : waits whenJust inh $ snd $ stdIn inHandle poStdin if poAsync then pure (pid, ExitSuccess) else do sequence_ wait flushBuffers res <- waitForProcess pid whenJust outh hClose whenJust errh hClose pure (pid, res) --------------------------------------------------------------------- -- COMPATIBILITY -- available in process-1.4.3.0, GHC ??? (Nov 2015) -- logic copied directly (apart from Ctrl-C handling magic using internal pieces) withCreateProcessCompat :: CreateProcess -> (Maybe Handle -> Maybe Handle -> Maybe Handle -> ProcessHandle -> IO a) -> IO a withCreateProcessCompat cp act = bracketOnError (createProcess cp) cleanup (\(m_in, m_out, m_err, ph) -> act m_in m_out m_err ph) where cleanup (inh, outh, errh, pid) = do terminateProcess pid whenJust inh $ ignoreSigPipe . hClose whenJust outh hClose whenJust errh hClose forkIO $ void $ waitForProcess pid shake-0.19.8/src/General/Template.hs0000644000000000000000000000741207346545000015354 0ustar0000000000000000{-# LANGUAGE CPP #-} {-# LANGUAGE ViewPatterns #-} -- Keep using iso8601DateFormat, since the alternative was introduced in time-1.9 -- while GHC 8.6 still has time-1.8. 
-- Safe once we no longer support GHC 8.6.
{-# OPTIONS_GHC -Wno-deprecations #-}

#ifdef FILE_EMBED
{-# LANGUAGE TemplateHaskell #-}
#endif

module General.Template(runTemplate) where

import System.FilePath.Posix
import Control.Exception.Extra
import Data.Char
import Data.Time
import System.IO.Unsafe
import Development.Shake.Internal.Paths
import qualified Data.ByteString.Lazy.Char8 as LBS
import qualified Language.Javascript.DGTable as DGTable
import qualified Language.Javascript.Flot as Flot
import qualified Language.Javascript.JQuery as JQuery

#ifdef FILE_EMBED
import Data.FileEmbed
import Language.Haskell.TH.Syntax ( runIO )
#endif

{- HLINT ignore "Redundant bracket" -} -- a result of CPP expansion

-- Very hard to abstract over TH, so we do it with CPP
#ifdef FILE_EMBED
#define FILE(x) (pure (LBS.fromStrict $(embedFile =<< runIO (x))))
#else
#define FILE(x) (LBS.readFile =<< (x))
#endif

libraries :: [(String, IO LBS.ByteString)]
libraries =
    [("jquery.js", FILE(JQuery.file))
    ,("jquery.dgtable.js", FILE(DGTable.file))
    ,("jquery.flot.js", FILE(Flot.file Flot.Flot))
    ,("jquery.flot.stack.js", FILE(Flot.file Flot.FlotStack))
    ]

-- | Template Engine. Perform the following replacements on a line basis:
--
-- * <script src="foo"></script> ==> <script>[[foo]]</script>
--
-- * <link href="foo" rel="stylesheet" type="text/css" /> ==> <style type="text/css">[[foo]]</style>
runTemplate :: (FilePath -> IO LBS.ByteString) -> LBS.ByteString -> IO LBS.ByteString
runTemplate ask = lbsMapLinesIO f
    where
        link = LBS.pack "<link href=\""
        script = LBS.pack "<script src=\""

        f x | Just file <- lbsStripPrefix script y = do res <- grab file; pure $ LBS.pack "<script>\n" `LBS.append` res `LBS.append` LBS.pack "\n</script>"
            | Just file <- lbsStripPrefix link y = do res <- grab file; pure $ LBS.pack "<style type=\"text/css\">\n" `LBS.append` res `LBS.append` LBS.pack "\n</style>"
            | otherwise = pure x
            where
                y = LBS.dropWhile isSpace x
                grab = asker . takeWhile (/= '\"') .
LBS.unpack asker o@(splitFileName -> ("lib/",x)) = case lookup x libraries of Nothing -> errorIO $ "Template library, unknown library: " ++ o Just act -> act asker "shake.js" = readDataFileHTML "shake.js" asker "data/metadata.js" = do time <- getCurrentTime pure $ LBS.pack $ "var version = " ++ show shakeVersionString ++ "\nvar generated = " ++ show (formatTime defaultTimeLocale (iso8601DateFormat (Just "%H:%M:%S")) time) asker x = ask x -- Perform a mapM on each line and put the result back together again lbsMapLinesIO :: (LBS.ByteString -> IO LBS.ByteString) -> LBS.ByteString -> IO LBS.ByteString -- If we do the obvious @fmap LBS.unlines . mapM f@ then all the monadic actions are run on all the lines -- before it starts producing the lazy result, killing streaming and having more stack usage. -- The real solution (albeit with too many dependencies for something small) is a streaming library, -- but a little bit of unsafePerformIO does the trick too. lbsMapLinesIO f = pure . LBS.unlines . map (unsafePerformIO . f) . 
LBS.lines --------------------------------------------------------------------- -- COMPATIBILITY -- available in bytestring-0.10.8.0, GHC 8.0 and above -- alternative implementation below lbsStripPrefix :: LBS.ByteString -> LBS.ByteString -> Maybe LBS.ByteString lbsStripPrefix prefix text = if a == prefix then Just b else Nothing where (a,b) = LBS.splitAt (LBS.length prefix) text shake-0.19.8/src/General/Thread.hs0000644000000000000000000000654407346545000015015 0ustar0000000000000000{-# LANGUAGE ScopedTypeVariables #-} -- | A bit like 'Fence', but not thread safe and optimised for avoiding taking the fence module General.Thread( withThreadsBoth, withThreadSlave, allocateThread, Thread, newThreadFinally, stopThreads ) where import General.Cleanup import Data.Hashable import Control.Concurrent.Extra import Control.Exception import General.Extra import Control.Monad.Extra data Thread = Thread ThreadId (Barrier ()) instance Eq Thread where Thread a _ == Thread b _ = a == b instance Hashable Thread where hashWithSalt salt (Thread a _) = hashWithSalt salt a -- | The inner thread is unmasked even if you started masked. newThreadFinally :: IO a -> (Thread -> Either SomeException a -> IO ()) -> IO Thread newThreadFinally act cleanup = do bar <- newBarrier t <- mask_ $ forkIOWithUnmask $ \unmask -> flip finally (signalBarrier bar ()) $ do res <- try $ unmask act me <- myThreadId cleanup (Thread me bar) res pure $ Thread t bar stopThreads :: [Thread] -> IO () stopThreads threads = do -- if a thread is in a masked action, killing it may take some time, so kill them in parallel bars <- sequence [do forkIO $ killThread t; pure bar | Thread t bar <- threads] mapM_ waitBarrier bars -- Run both actions. If either throws an exception, both threads -- are killed and an exception reraised. -- Not called much, so simplicity over performance (2 threads). 
withThreadsBoth :: IO a -> IO b -> IO (a, b) withThreadsBoth act1 act2 = do bar1 <- newBarrier bar2 <- newBarrier parent <- myThreadId ignore <- newVar False mask $ \unmask -> do t1 <- forkIOWithUnmask $ \unmask -> do res1 :: Either SomeException a <- try $ unmask act1 unlessM (readVar ignore) $ whenLeft res1 $ throwTo parent signalBarrier bar1 res1 t2 <- forkIOWithUnmask $ \unmask -> do res2 :: Either SomeException b <- try $ unmask act2 unlessM (readVar ignore) $ whenLeft res2 $ throwTo parent signalBarrier bar2 res2 res :: Either SomeException (a,b) <- try $ unmask $ do Right v1 <- waitBarrier bar1 Right v2 <- waitBarrier bar2 pure (v1,v2) writeVar ignore True killThread t1 forkIO $ killThread t2 waitBarrier bar1 waitBarrier bar2 either throwIO pure res -- | Run an action in a separate thread. -- After the first action terminates, the thread will be killed. -- If the action raises an exception it will be rethrown on the parent thread. withThreadSlave :: IO () -> IO a -> IO a withThreadSlave slave act = withCleanup $ \cleanup -> do allocateThread cleanup slave act -- | Run the given action in a separate thread. -- On cleanup, the thread will be killed before continuing. -- If the action raises an exception it will be rethrown on the parent thread. 
allocateThread :: Cleanup -> IO () -> IO () allocateThread cleanup act = do bar <- newBarrier parent <- myThreadId ignore <- newVar False void $ allocate cleanup (mask_ $ forkIOWithUnmask $ \unmask -> do res :: Either SomeException () <- try $ unmask act unlessM (readVar ignore) $ whenLeft res $ throwTo parent signalBarrier bar () ) (\t -> do writeVar ignore True; killThread t; waitBarrier bar) shake-0.19.8/src/General/Timing.hs0000644000000000000000000000335107346545000015026 0ustar0000000000000000 module General.Timing(resetTimings, addTiming, getTimings) where import Data.IORef.Extra import System.IO.Unsafe import Data.Tuple.Extra import Data.List.Extra import Numeric.Extra import General.Extra import System.Time.Extra {-# NOINLINE timer #-} timer :: IO Seconds timer = unsafePerformIO offsetTime {-# NOINLINE timings #-} timings :: IORef [(Seconds, String)] -- number of times called, newest first timings = unsafePerformIO $ newIORef [] resetTimings :: IO () resetTimings = do now <- timer writeIORef timings [(now, "Start")] -- | Print all withTiming information and clear it. 
getTimings :: IO [String] getTimings = do now <- timer old <- atomicModifyIORef timings dupe pure $ showTimings now $ reverse old addTiming :: String -> IO () addTiming msg = do now <- timer atomicModifyIORef_ timings ((now,msg):) showTimings :: Seconds -> [(Seconds, String)] -> [String] showTimings _ [] = [] showTimings stop times = showGap $ [(a ++ " ", showDP 3 b ++ "s " ++ showPerc b ++ " " ++ progress b) | (a,b) <- xs] ++ [("Total", showDP 3 sm ++ "s " ++ showPerc sm ++ " " ++ replicate 25 ' ')] where a // b = if b == 0 then 0 else a / b showPerc x = let s = show $ floor $ x * 100 // sm in replicate (3 - length s) ' ' ++ s ++ "%" progress x = let i = floor $ x * 25 // mx in replicate i '=' ++ replicate (25-i) ' ' mx = maximum $ map snd xs sm = sum $ map snd xs xs = [ (name, stop - start) | ((start, name), stop) <- zipExact times $ map fst (drop1 times) ++ [stop]] showGap :: [(String,String)] -> [String] showGap xs = [a ++ replicate (n - length a - length b) ' ' ++ b | (a,b) <- xs] where n = maximum [length a + length b | (a,b) <- xs] shake-0.19.8/src/General/TypeMap.hs0000644000000000000000000000247207346545000015161 0ustar0000000000000000{-# LANGUAGE ExistentialQuantification, ConstraintKinds, KindSignatures, GADTs, ScopedTypeVariables, Rank2Types #-} module General.TypeMap( Map, empty, singleton, insert, map, lookup, unionWith, toList, size ) where import qualified Data.HashMap.Strict as Map import Data.Typeable import Unsafe.Coerce import Data.Functor import qualified Prelude import Prelude hiding (lookup, map) data F f = forall a . F !(f a) unF :: F f -> f a unF x = case x of F x -> unsafeCoerce x newtype Map f = Map (Map.HashMap TypeRep (F f)) empty :: Map f empty = Map Map.empty singleton :: Typeable a => f a -> Map f singleton x = Map $ Map.singleton (typeRep x) (F x) insert :: Typeable a => f a -> Map f -> Map f insert x (Map mp) = Map $ Map.insert (typeRep x) (F x) mp lookup :: forall a f . 
Typeable a => Map f -> Maybe (f a) lookup (Map mp) = unF <$> Map.lookup (typeRep (Proxy :: Proxy a)) mp unionWith :: (forall a . f a -> f a -> f a) -> Map f -> Map f -> Map f unionWith f (Map mp1) (Map mp2) = Map $ Map.unionWith (\x1 x2 -> F $ f (unF x1) (unF x2)) mp1 mp2 map :: (forall a . f1 a -> f2 a) -> Map f1 -> Map f2 map f (Map mp) = Map $ Map.map (\(F a) -> F $ f a) mp toList :: (forall a . f a -> b) -> Map f -> [b] toList f (Map mp) = Prelude.map (\(F a) -> f a) $ Map.elems mp size :: Map f -> Int size (Map mp) = Map.size mp shake-0.19.8/src/General/Wait.hs0000644000000000000000000001101707346545000014501 0ustar0000000000000000{-# LANGUAGE BangPatterns #-} {-# LANGUAGE DeriveFunctor #-} -- | A bit like 'Fence', but not thread safe and optimised for avoiding taking the fence module General.Wait( Wait(Now,Later), runWait, quickly, fromLater, firstJustWaitUnordered, firstLeftWaitUnordered ) where import Control.Monad.Extra import Control.Monad.IO.Class import Data.IORef.Extra import Data.List.Extra import Data.Primitive.Array import GHC.Exts(RealWorld) import Control.Monad.Fail import Prelude runWait :: Monad m => Wait m a -> m (Wait m a) runWait (Lift x) = runWait =<< x runWait x = pure x fromLater :: Monad m => Wait m a -> (a -> m ()) -> m () fromLater (Lift x) f = do x <- x; fromLater x f fromLater (Now x) f = f x fromLater (Later x) f = x f quickly :: Functor m => m a -> Wait m a quickly = Lift . 
fmap Now data Wait m a = Now a | Lift (m (Wait m a)) | Later ((a -> m ()) -> m ()) deriving Functor instance (Monad m, Applicative m) => Applicative (Wait m) where pure = Now Now x <*> y = x <$> y Lift x <*> y = Lift $ (<*> y) <$> x Later x <*> Now y = Later $ \c -> x $ \x -> c $ x y -- Note: We pull the Lift from the right BEFORE the Later, to enable parallelism Later x <*> Lift y = Lift $ do y <- y; pure $ Later x <*> y Later x <*> Later y = Later $ \c -> x $ \x -> y $ \y -> c $ x y instance (Monad m, Applicative m) => Monad (Wait m) where return = pure (>>) = (*>) Now x >>= f = f x Lift x >>= f = Lift $ do x <- x; pure $ x >>= f Later x >>= f = Later $ \c -> x $ \x -> do x <- runWait $ f x case x of Now x -> c x _ -> fromLater x c instance (MonadIO m, Applicative m) => MonadIO (Wait m) where liftIO = Lift . liftIO . fmap Now instance MonadFail m => MonadFail (Wait m) where fail = Lift . Control.Monad.Fail.fail firstJustWaitUnordered :: MonadIO m => (a -> Wait m (Maybe b)) -> [a] -> Wait m (Maybe b) firstJustWaitUnordered f = go 0 [] . 
map f where -- keep a list of those things we might visit later, and ask for each we see in turn go :: MonadIO m => Int -> [(Maybe a -> m ()) -> m ()] -> [Wait m (Maybe a)] -> Wait m (Maybe a) go !nlater later (x:xs) = case x of Now (Just a) -> Now $ Just a Now Nothing -> go nlater later xs Later l -> go (succ nlater) (l:later) xs Lift x -> Lift $ do x <- x pure $ go nlater later (x:xs) go _ [] [] = Now Nothing go _ [l] [] = Later l go nls ls [] = Later $ \callback -> do ref <- liftIO $ newIORef nls forM_ ls $ \l -> l $ \r -> do old <- liftIO $ readIORef ref when (old > 0) $ case r of Just a -> do liftIO $ writeIORef' ref 0 callback $ Just a Nothing -> do liftIO $ writeIORef' ref $ old-1 when (old == 1) $ callback Nothing firstLeftWaitUnordered :: MonadIO m => (a -> Wait m (Either e b)) -> [a] -> Wait m (Either e [b]) firstLeftWaitUnordered f xs = do let n = length xs mut <- liftIO $ newArray n undefined res <- go mut [] $ zipFrom 0 $ map f xs case res of Just e -> pure $ Left e Nothing -> liftIO $ Right <$> mapM (readArray mut) [0..n-1] where -- keep a list of those things we might visit later, and ask for each we see in turn go :: MonadIO m => MutableArray RealWorld b -> [(Int, (Either e b -> m ()) -> m ())] -> [(Int, Wait m (Either e b))] -> Wait m (Maybe e) go mut later ((i,x):xs) = case x of Now (Left e) -> Now $ Just e Now (Right b) -> do liftIO $ writeArray mut i b go mut later xs Later l -> go mut ((i,l):later) xs Lift x -> Lift $ do x <- x pure $ go mut later ((i,x):xs) go _ [] [] = Now Nothing go mut ls [] = Later $ \callback -> do ref <- liftIO $ newIORef $ length ls forM_ ls $ \(i,l) -> l $ \r -> do old <- liftIO $ readIORef ref when (old > 0) $ case r of Left a -> do liftIO $ writeIORef' ref 0 callback $ Just a Right v -> do liftIO $ writeArray mut i v liftIO $ writeIORef' ref $ old-1 when (old == 1) $ callback Nothing shake-0.19.8/src/0000755000000000000000000000000007346545000011664 
5ustar0000000000000000shake-0.19.8/src/Paths.hs0000644000000000000000000000052107346545000013275 0ustar0000000000000000-- | Fake cabal module for local building module Paths_shake(getDataDir, version) where import Data.Version.Extra -- If Shake can't find files in the data directory it tries relative to the executable getDataDir :: IO FilePath getDataDir = pure "random_path_that_cannot_possibly_exist" version :: Version version = makeVersion [0,0] shake-0.19.8/src/Run.hs0000644000000000000000000000370007346545000012764 0ustar0000000000000000 module Run(main) where import Development.Ninja.All import System.Environment import Development.Shake import Development.Shake.FilePath import General.Timing(resetTimings) import Control.Monad.Extra import Control.Exception.Extra import Data.Maybe import qualified System.Directory as IO import General.Extra import General.GetOpt import System.Process import System.Exit import Data.Either.Extra main :: IO () main = do resetTimings args <- getArgs hsExe <- findFile [".shake" "shake" <.> exe ,"Shakefile.hs","Shakefile.lhs"] case hsExe of Just file -> do (prog,args)<- pure $ if takeExtension file `elem` [".hs",".lhs"] then ("runhaskell", file:args) else (toNative file, args) e <- rawSystem prog args when (e /= ExitSuccess) $ exitWith e Nothing -> do let opts = shakeOptions{shakeThreads=0,shakeCreationCheck=False,shakeNeedDirectory=True} let go = shakeArgsWith opts flags $ \opts targets -> do let tool = listToMaybe [x | Tool x <- opts] makefile <- case reverse [x | UseMakefile x <- opts] of x:_ -> pure x _ -> do res <- findFile ["build.ninja"] case res of Just x -> pure x Nothing -> errorIO "Could not find `build.ninja'" runNinja go makefile targets tool withArgs ("--no-time":args) go data Flag = UseMakefile FilePath | Tool String flags = [Option "f" ["file","makefile"] (ReqArg (Right . UseMakefile) "FILE") "Read FILE as a makefile." ,Option "t" ["tool"] (ReqArg (Right . Tool) "TOOL") "Ninja-compatible tools." 
] findFile :: [FilePath] -> IO (Maybe FilePath) findFile = findM (fmap (fromRight False) . tryIO . IO.doesFileExist) shake-0.19.8/src/Test.hs0000644000000000000000000001342607346545000013145 0ustar0000000000000000 module Test(main) where import Control.Exception.Extra import Control.Monad.Extra import Data.Maybe import Data.List.Extra import System.Directory import System.Environment import General.Timing import Development.Shake.Internal.FileInfo import Development.Shake.Internal.FileName import qualified Data.ByteString.Char8 as BS import Test.Type(sleepFileTimeCalibrate) import Control.Concurrent.Extra import qualified Test.Basic import qualified Test.Batch import qualified Test.Benchmark import qualified Test.Builtin import qualified Test.BuiltinOverride import qualified Test.C import qualified Test.Cache import qualified Test.Cleanup import qualified Test.CloseFileHandles import qualified Test.Command import qualified Test.Config import qualified Test.Database import qualified Test.Digest import qualified Test.Directory import qualified Test.Docs import qualified Test.Errors import qualified Test.Existence import qualified Test.FileLock import qualified Test.FilePath import qualified Test.FilePattern import qualified Test.Files import qualified Test.Forward import qualified Test.History import qualified Test.Journal import qualified Test.Lint import qualified Test.Live import qualified Test.Manual import qualified Test.Match import qualified Test.Monad import qualified Test.Ninja import qualified Test.Oracle import qualified Test.OrderOnly import qualified Test.Parallel import qualified Test.Pool import qualified Test.Progress import qualified Test.Random import qualified Test.Rebuild import qualified Test.Reschedule import qualified Test.Resources import qualified Test.Self import qualified Test.SelfMake import qualified Test.Tar import qualified Test.Targets import qualified Test.Thread import qualified Test.Tup import qualified Test.Unicode import qualified 
Test.Util import qualified Test.Verbosity import qualified Test.Version import qualified Run fakes = ["clean" * clean, "test" * test, "make" * makefile, "filetime" * filetime] where (*) = (,) mains = ["basic" * Test.Basic.main ,"batch" * Test.Batch.main ,"benchmark" * Test.Benchmark.main ,"builtin" * Test.Builtin.main ,"builtinOverride" * Test.BuiltinOverride.main ,"c" * Test.C.main ,"cache" * Test.Cache.main ,"cleanup" * Test.Cleanup.main ,"closefilehandles" * Test.CloseFileHandles.main ,"command" * Test.Command.main ,"config" * Test.Config.main ,"database" * Test.Database.main ,"digest" * Test.Digest.main ,"directory" * Test.Directory.main ,"docs" * Test.Docs.main ,"errors" * Test.Errors.main ,"existence" * Test.Existence.main ,"filelock" * Test.FileLock.main ,"filepath" * Test.FilePath.main ,"filepattern" * Test.FilePattern.main ,"files" * Test.Files.main ,"forward" * Test.Forward.main ,"history" * Test.History.main ,"journal" * Test.Journal.main ,"lint" * Test.Lint.main ,"live" * Test.Live.main ,"manual" * Test.Manual.main ,"match" * Test.Match.main ,"monad" * Test.Monad.main ,"ninja" * Test.Ninja.main ,"oracle" * Test.Oracle.main ,"orderonly" * Test.OrderOnly.main ,"parallel" * Test.Parallel.main ,"pool" * Test.Pool.main ,"progress" * Test.Progress.main ,"random" * Test.Random.main ,"rebuild" * Test.Rebuild.main ,"reschedule" * Test.Reschedule.main ,"resources" * Test.Resources.main ,"self" * Test.Self.main ,"selfmake" * Test.SelfMake.main ,"tar" * Test.Tar.main ,"targets" * Test.Targets.main ,"thread" * Test.Thread.main ,"tup" * Test.Tup.main ,"unicode" * Test.Unicode.main ,"util" * Test.Util.main ,"verbosity" * Test.Verbosity.main ,"version" * Test.Version.main] where (*) = (,) main :: IO () main = do resetTimings xs <- getArgs case flip lookup (fakes ++ mains) =<< listToMaybe xs of _ | null xs -> do putStrLn "******************************************************************" putStrLn "** Running shake test suite, run with '--help' to see arguments **" 
putStrLn "******************************************************************" unlessM (doesFileExist "shake.cabal") $ do putStrLn "" errorIO "\nERROR: Must run the test suite from a directory containing the Shake repo." withArgs ["test"] main withArgs ["random","test","3m"] main Nothing -> putStrLn $ unlines ["Welcome to the Shake demo" ,"" ,unwords $ "Modes:" : map fst fakes ,unwords $ "Demos:" : map fst mains ,"" ,"As an example, try:" ,"" ," shake-test self --jobs=2 --trace" ,"" ,"Which will build Shake, using Shake, on 2 threads." ,"You must run the test suite from a directory containing the Shake repo." ] Just main -> main =<< sleepFileTimeCalibrate "output/calibrate" makefile :: IO () -> IO () makefile _ = do args <- getArgs withArgs (drop1 args) Run.main filetime :: IO () -> IO () filetime _ = do args <- getArgs addTiming "Reading files" files <- concatForM (drop1 args) $ \file -> BS.lines . BS.filter (/= '\r') <$> BS.readFile file let n = length files evaluate n addTiming "Modtime" let (a,bcd) = splitAt (n `div` 4) files let (b,cd) = splitAt (n `div` 4) bcd let (c,d) = splitAt (n `div` 4) cd vars <- forM [a,b,c,d] $ \xs -> onceFork $ mapM_ (getFileInfo False . 
fileNameFromByteString) xs sequence_ vars clean :: IO () -> IO () clean extra = sequence_ [withArgs [name,"clean"] $ main extra | (name,main) <- mains] test :: IO () -> IO () test yield = do args <- getArgs flip onException (putStrLn "TESTS FAILED") $ sequence_ [withArgs (name:"test":drop1 args) $ test yield | (name,test) <- mains, name /= "random"] shake-0.19.8/src/Test/0000755000000000000000000000000007346545000012603 5ustar0000000000000000shake-0.19.8/src/Test/Basic.hs0000644000000000000000000001445107346545000014165 0ustar0000000000000000 module Test.Basic(main) where import Development.Shake import System.FilePath import Test.Type import System.Directory as IO import Data.List import Control.Monad import General.Extra main = testBuild test $ do "AB.txt" %> \out -> do need ["A.txt", "B.txt"] text1 <- readFile' "A.txt" text2 <- readFile' "B.txt" writeFile' out $ text1 ++ text2 "twice.txt" %> \out -> do let src = "once.txt" need [src, src] copyFile' src out "once.txt" %> \out -> do src <- readFile' "zero.txt" writeFile' out src phonys $ \x -> if x /= "halfclean" then Nothing else Just $ removeFilesAfter "dir" ["//*e.txt"] phony "cleaner" $ removeFilesAfter "dir" ["//*"] phony "cleandb" $ removeFilesAfter "." 
[".shake.database"] phony "configure" $ liftIO $ appendFile "configure" "1" phony "install" $ do need ["configure","once.txt"] liftIO $ appendFile "install" "1" phony "duplicate1" $ need ["duplicate2","duplicate3"] phony "duplicate2" $ need ["duplicate3"] phony "duplicate3" $ liftIO $ appendFile "duplicate" "1" phony "dummy" $ liftIO $ appendFile "dummy" "1" phony "threads" $ do x <- getShakeOptions writeFile' "threads.txt" $ show $ shakeThreads x phony ("slash" "platform") $ pure () phony "slash/forward" $ pure () phony "options" $ do opts <- getShakeOptions putInfo $ show opts "dummer.txt" %> \out -> do need ["dummy","dummy"] need ["dummy"] liftIO $ appendFile out "1" r <- newResource ".log file" 1 let trace x = withResource r 1 $ liftIO $ appendFile ".log" x "*.par" %> \out -> do trace "[" (if "unsafe" `isInfixOf` out then unsafeExtraThread else id) $ liftIO $ sleep 0.1 trace "]" writeFile' out out "sep" "1.txt" %> \out -> writeFile' out "" "sep/2.txt" %> \out -> writeFile' out "" ["sep" "3.txt", "sep" "4.txt", "sep" "5.*", "sep/6.txt"] |%> \out -> writeFile' out "" ["sep" "7.txt"] |%> \out -> writeFile' out "" "ids/source" %> \_ -> pure () "ids/out" %> \out -> do need =<< readFileLines "ids/source"; writeFile' out "" "ids/*" %> \out -> do alwaysRerun; trace (takeFileName out); writeFile' out $ takeFileName out "rerun" %> \out -> do alwaysRerun; liftIO $ appendFile out "." 
    phony "foo" $ liftIO $ createDirectoryRecursive "foo"

    phony "ordering2" $ liftIO $ appendFile "order.log" "X"
    phony "ordering" $ do
        liftIO $ appendFile "order.log" "Y"
        need ["ordering2"]

test build = do
    build ["clean"]
    writeFile "A.txt" "AAA"
    writeFile "B.txt" "BBB"
    build ["AB.txt","--sleep"]
    assertContents "AB.txt" "AAABBB"
    appendFile "A.txt" "aaa"
    build ["AB.txt"]
    assertContents "AB.txt" "AAAaaaBBB"
    removeFile "AB.txt"
    build ["AB.txt"]
    assertContents "AB.txt" "AAAaaaBBB"

    writeFile "zero.txt" "xxx"
    build ["twice.txt","--sleep"]
    assertContents "twice.txt" "xxx"
    writeFile "zero.txt" "yyy"
    build ["once.txt","--sleep"]
    assertContents "twice.txt" "xxx"
    assertContents "once.txt" "yyy"
    writeFile "zero.txt" "zzz"
    build ["once.txt","twice.txt","--sleep"]
    assertContents "twice.txt" "zzz"
    assertContents "once.txt" "zzz"
    removeFile "twice.txt"
    build ["twice.txt"]
    assertContents "twice.txt" "zzz"

    show shakeOptions === show shakeOptions
    build ["options"]

    createDirectoryRecursive "dir"
    writeFile "dir/ae.txt" ""
    writeFile "dir/ea.txt" ""
    build ["halfclean"]
    assertBoolIO (IO.doesDirectoryExist "dir") "Directory should exist, cleaner should not have removed it"
    build ["cleaner"]
    sleep 1 -- sometimes takes a while for the file system to notice
    assertBoolIO (not <$> IO.doesDirectoryExist "dir") "Directory should not exist, cleaner should have removed it"
    assertBoolIO (IO.doesFileExist ".shake.database") "Precondition not met"
    build ["cleandb"]
    assertBoolIO (not <$> IO.doesFileExist ".shake.database") "Postcondition not met"

    writeFile "zero.txt" ""
    writeFile "configure" ""
    writeFile "install" ""
    build ["configure"]
    build ["install"]
    build ["install"]
    assertContents "configure" "111"
    assertContents "install" "11"

    build ["dummy"]
    assertContents "dummy" "1"
    build ["dummy"]
    assertContents "dummy" "11"
    build ["dummy","dummy"]
    assertContents "dummy" "111"

    writeFile "dummer.txt" ""
    build ["dummer.txt"]
    assertContents "dummer.txt" "1"
    build ["dummer.txt"]
    assertContents "dummer.txt" "11"
    build ["1.par","2.par","-j1"]
    assertContents ".log" "[][]"
    writeFile ".log" ""
    build ["3.par","4.par","-j2"]
    assertContents ".log" "[[]]"
    writeFile ".log" ""
    processors <- getProcessorCount
    putStrLn $ "getProcessorCount returned " ++ show processors
    when (processors > 1) $ do
        build ["5.par","6.par","-j0"]
        assertContents ".log" "[[]]"
        writeFile ".log" ""
        build ["unsafe1.par","unsafe2.par","-j2"]
        assertContents ".log" "[[]]"

    build ["threads","-j3"]
    assertContents "threads.txt" "3"
    build ["threads","-j0"]
    assertContents "threads.txt" (show processors)

    writeFile "duplicate" ""
    build ["duplicate1","duplicate3"]
    assertContents "duplicate" "1"

    build $ concat [["sep/" ++ show i ++ ".txt", "sep" </> show i ++ ".txt"] | i <- [1..7]]
    build ["slash" </> "platform","slash" </> "forward"]
    build ["slash/platform","slash/forward"]

    createDirectoryRecursive "ids"
    writeFile "ids/source" "ids/a"
    build ["ids/out","--sleep"]
    writeFile ".log" ""
    writeFile "ids/source" "ids/b"
    build ["ids/out","-j4"]
    -- if you collapse depends to [Id] then this ends up asking for the stale 'a'
    assertContents ".log" "b"

    writeFile "rerun" ""
    build ["rerun"]
    assertContents "rerun" "."
    build ["rerun","rerun"]
    assertContents "rerun" ".."
    build ["foo"]
    build ["foo"]

    build [] -- should say "no want/action statements, nothing to do" (checked manually)

    -- #523, #524 - phony children should not run first
    writeFile "order.log" ""
    build ["ordering"]
    assertContents "order.log" "YX"
    build ["ordering"]
    assertContents "order.log" "YXYX"

shake-0.19.8/src/Test/Batch.hs

module Test.Batch(main) where

import Development.Shake
import Development.Shake.FilePath
import System.Directory
import Data.List
import General.Extra
import Test.Type
import Control.Monad

main = testBuild test $ do
    let inp x = x -<.> "in"
    file <- newResource "log.txt" 1
    batch 3 ("*.out" %>) (\out -> do need [inp out]; pure out) $ \outs -> do
        liftIO $ assertBool (length outs <= 3) "length outs <= 3"
        withResource file 1 $ liftIO $ appendFile "log.txt" $ show (length outs) ++ "\n"
        putInfo $ "Building batch: " ++ unwords outs
        forM_ outs $ \out -> liftIO $ copyFile (inp out) out
    want [show i <.> "out" | i <- [1..6]]

    "ABn.txt" %> \out -> do
        xs <- needHasChanged ["An.txt", "Bn.txt"]
        writeFileLines out xs

    ["An", "Bn"] &?%> \outs -> do
        xs <- needHasChanged $ map (-<.> "in") outs
        os <- mapM resultHasChanged outs
        forM_ (zip outs os) $ \(out, o) ->
            when (o || (out -<.> "in" `elem` xs)) $ writeFile' out "1"

    "On" %> \out -> do
        xs <- needHasChanged ["An", "Bn"]
        o <- resultHasChanged out
        writeFileLines out $ xs ++ ["On" | o]

    batch maxBound ("batch_max.*" %>) pure $ \outs ->
        forM_ outs $ \out -> writeFile' out $ show $ length outs

    phony "sleep2" $ liftIO $ sleep 2
    batch 2 ("batch_profile.*" %>) (\x -> when ("1" `isSuffixOf` x) (liftIO $ sleep 1) >> pure x) $ \outs -> do
        liftIO $ sleep 2
        need ["sleep2"]
        forM_ outs $ \out -> writeFile' out ""

test build = do
    forM_ [1..6] $ \i -> writeFile (show i <.> "in") $ show i
    build ["--sleep","-j2"]
    assertBoolIO (do src <- readFile "log.txt"; pure $ length (lines src) < 6) "some batching"
    writeFile "log.txt" ""
    writeFile "2.in" "22"
    writeFile "5.in" "55"
    build []
    assertContents "log.txt" "2\n"

    writeFile "An.txt" "1"
    writeFile "Bn.txt" "1"
    build ["ABn.txt", "--sleep"]
    assertContents "ABn.txt" "An.txt\nBn.txt\n"
    writeFile "An.txt" "1"
    build ["ABn.txt", "--sleep"]
    assertContents "ABn.txt" "An.txt\n"
    writeFile "Bn.txt" "1"
    build ["ABn.txt", "--sleep"]
    assertContents "ABn.txt" "Bn.txt\n"
    build ["ABn.txt", "--sleep"]
    assertContents "ABn.txt" "Bn.txt\n"
    writeFile "ABn.txt" "bogus"
    build ["ABn.txt", "--sleep"]
    assertContents "ABn.txt" ""
    writeFile "Bn.txt" "1"
    build ["Bn.txt", "--sleep"]
    build ["ABn.txt"]
    assertContents "ABn.txt" "Bn.txt\n"

    forM_ [[],["--usepredicate"]] $ \args -> do
        writeFile "An.in" "1"
        writeFile "Bn.in" "1"
        removeFile_ "On"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "An\nBn\nOn\n"
        writeFile "An.in" "1"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "An\n"
        writeFile "Bn.in" "1"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "Bn\n"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "Bn\n"
        removeFile "An"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "An\n"
        removeFile "An"
        writeFile "Bn.in" "2"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "An\nBn\n"
        removeFile "On"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "On\n"

    build ["batch_max." ++ show i | i <- [1..100]]
    assertContents "batch_max.72" "100"

    let names = ["batch_profile." ++ show i | i <- [1..2]]
    build names
    assertTimings build $ ("sleep2",2) : zip names [2,1]

shake-0.19.8/src/Test/Benchmark.hs

module Test.Benchmark(main) where

import General.GetOpt
import Development.Shake
import Test.Type
import Text.Read
import Data.List.Extra
import Development.Shake.FilePath

data Opts = Depth Int | Breadth Int

opts = [Option "" ["depth"  ] (ReqArg (fmap Depth   . readEither) "INT") ""
       ,Option "" ["breadth"] (ReqArg (fmap Breadth .
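Test.Batch above exercises Shake's `batch`, which collects up to N pending outputs and hands them to a single rule invocation. The grouping policy can be sketched as a pure function; the `chunksOf` helper below is illustrative only, not Shake's internal implementation:

```haskell
-- Split a queue of pending outputs into runs of at most n,
-- mirroring how `batch n` caps how many outputs one rule
-- invocation may receive. Purely illustrative.
chunksOf :: Int -> [a] -> [[a]]
chunksOf n xs
    | n <= 0    = error "chunksOf: batch size must be positive"
    | null xs   = []
    | otherwise = let (now, later) = splitAt n xs
                  in now : chunksOf n later
```

With a batch size of 3 and six pending `.out` files this yields two groups of three, which is why the test asserts `length outs <= 3` and expects fewer than six lines in `log.txt`.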
           readEither) "INT") ""]

-- | Given a breadth and depth come up with a set of build files
main = testBuildArgs test opts $ \opts -> do
    let depth = lastDef 75 [x | Depth x <- opts]
    let breadth = lastDef 75 [x | Breadth x <- opts]

    want ["0." ++ show i | i <- [1..breadth]]
    "*" %> \out -> do
        let d = read $ takeBaseName out
        need [show (d + 1) ++ "." ++ show i | d < depth, i <- [1..breadth]]
        writeFile' out ""

test build = do
    -- these help to test the stack limit
    build ["clean"]
    build []
    build []

shake-0.19.8/src/Test/Builtin.hs

{-# LANGUAGE TypeFamilies, GeneralizedNewtypeDeriving, DeriveDataTypeable #-}

module Test.Builtin(main) where

import Development.Shake
import Development.Shake.Classes
import Development.Shake.Rule
import System.Directory as IO
import qualified System.IO.Extra as IO
import qualified Data.ByteString.Char8 as BS
import Test.Type

-- WARNING: This code is also reproduced in "Development.Shake.Rule" as documentation.
--          If it needs editing, you probably need to edit it there too.
newtype File = File FilePath
    deriving (Show,Eq,Hashable,Binary,NFData,Typeable)
type instance RuleResult File = ()

data FileRule = FileRule File (Action ())
    deriving Typeable

addBuiltinFileRule :: Rules ()
addBuiltinFileRule = addBuiltinRule noLint noIdentity run
    where
        fileContents (File x) = do b <- IO.doesFileExist x; if b then IO.readFile' x else pure ""

        run :: BuiltinRun File ()
        run key old mode = do
            now <- liftIO $ fileContents key
            if mode == RunDependenciesSame && fmap BS.unpack old == Just now then
                pure $ RunResult ChangedNothing (BS.pack now) ()
            else do
                (_, act) <- getUserRuleOne key (const Nothing) $ \(FileRule k act) ->
                    if k == key then Just act else Nothing
                act
                now <- liftIO $ fileContents key
                pure $ RunResult ChangedRecomputeDiff (BS.pack now) ()

fileRule :: FilePath -> Action () -> Rules ()
fileRule file act = addUserRule $ FileRule (File file) act

fileNeed :: FilePath -> Action ()
fileNeed = apply1 . File

main = testBuild test $ do
    addBuiltinFileRule
    fileRule "a.txt" $ pure ()
    fileRule "b.txt" $ do
        fileNeed "a.txt"
        liftIO $ appendFile "log.txt" "X"
        liftIO $ writeFile "b.txt" .
            reverse =<< readFile "a.txt"
    action $ fileNeed "b.txt"

test build = do
    writeFile "log.txt" ""
    writeFile "a.txt" "test"
    build []
    assertContents "b.txt" "tset"
    assertContents "log.txt" "X"
    build []
    assertContents "log.txt" "X" -- it doesn't rebuild
    writeFile "a.txt" "more"
    build []
    assertContents "b.txt" "erom"

shake-0.19.8/src/Test/BuiltinOverride.hs

{-# LANGUAGE DeriveDataTypeable #-}
{-# LANGUAGE GeneralizedNewtypeDeriving #-}
{-# LANGUAGE TypeFamilies #-}

module Test.BuiltinOverride (main) where

import Control.Concurrent
import Development.Shake
import Development.Shake.Classes
import Development.Shake.Rule
import Test.Type

newtype Key = Key Int
    deriving (Show, Eq, Hashable, Binary, NFData, Typeable)
type instance RuleResult Key = ()

main sleep = do
    store <- newEmptyMVar
    testBuild (test store) (setRules store) sleep

setRules resultsStore = do
    addBuiltinRule noLint noIdentity $ \(Key n) _ _ -> do
        liftIO $ putMVar resultsStore n
        pure $ RunResult ChangedRecomputeDiff mempty ()
    addBuiltinRule noLint noIdentity $ \(Key n) _ _ -> do
        liftIO $ putMVar resultsStore (n + 1)
        pure $ RunResult ChangedRecomputeDiff mempty ()
    action $ apply1 $ Key 1

test store build = do
    build ["--allow-redefine-rules"]
    res <- takeMVar store
    assertBool (res == 2) "Rule was not overridden"
    assertException ["rule defined twice"] $ build ["--quiet"]

shake-0.19.8/src/Test/C.hs

module Test.C(main) where

import Development.Shake
import Development.Shake.FilePath
import Test.Type

main = testBuild defaultTest $ do
    let src = shakeRoot </> "src/Test/C"
    want ["Main.exe"]

    "Main.exe" %> \out -> do
        cs <- getDirectoryFiles src ["*.c"]
        let os = map (<.> "o") cs
        need os
        cmd "gcc -o" [out] os

    "*.c.o" %> \out -> do
        let c = src </> takeBaseName out
        need [c]
        headers <- cIncludes c
        need $ map ((</>) src .
            takeFileName) headers
        cmd "gcc -o" [out] "-c" [c]

cIncludes :: FilePath -> Action [FilePath]
cIncludes x = do
    Stdout stdout <- cmd "gcc" ["-MM",x]
    pure $ drop 2 $ words stdout

shake-0.19.8/src/Test/C/constants.c

char msg[] = "Hello Shake Users!";

char* message()
{
    return msg;
}

shake-0.19.8/src/Test/C/constants.h

char* message();

shake-0.19.8/src/Test/C/main.c

#include <stdio.h>
#include "constants.h"

int main()
{
    printf("%s\n", message());
    return 0;
}

shake-0.19.8/src/Test/Cache.hs

module Test.Cache(main) where

import Development.Shake
import Development.Shake.FilePath
import System.Directory
import Data.Char
import Test.Type

main = testBuild test $ do
    vowels <- newCache $ \file -> do
        src <- readFile' file
        liftIO $ appendFile "trace.txt" "1"
        pure $ length $ filter isDigit src
    "*.out*" %> \x -> writeFile' x .
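`cIncludes` above derives header dependencies by parsing `gcc -MM` output, dropping the `foo.o:` target and the source file itself before treating the rest as headers. That parse can be exercised in isolation; note this naive sketch assumes single-line output, and `\` continuation markers in real multi-line `gcc -MM` output would survive as stray entries:

```haskell
-- Extract headers from "main.o: main.c constants.h" style output.
-- The first two words are the object target and the C source file,
-- so everything after them is treated as an included header.
parseMM :: String -> [FilePath]
parseMM = drop 2 . words
```

For the C test files above, `gcc -MM main.c` produces a line naming `main.o`, `main.c`, and `constants.h`, so only the header survives the parse.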
        show =<< vowels (dropExtension x <.> "txt")

    startCompiler <- newCache $ \() -> do
        liftIO $ writeFile "compiler.txt" "on"
        runAfter $ writeFile "compiler.txt" "off"
    "*.lang" %> \out -> do
        startCompiler ()
        liftIO $ copyFile "compiler.txt" out

    -- Bug fixed in https://github.com/ndmitchell/shake/pull/796
    bug796_2 <- newCache $ \() -> do
        readFile' "bug796.2"
    "bug796" %> \out -> do
        a <- readFile' "bug796.1"
        b <- bug796_2 ()
        writeFile' out $ a ++ b

test build = do
    build ["clean"]
    writeFile "trace.txt" ""
    writeFile "vowels.txt" "abc123a"
    build ["vowels.out1","vowels.out2","-j3","--sleep"]
    assertContents "trace.txt" "1"
    assertContents "vowels.out1" "3"
    assertContents "vowels.out2" "3"

    build ["vowels.out2","-j3"]
    assertContents "trace.txt" "1"
    assertContents "vowels.out1" "3"

    writeFile "vowels.txt" "12xyz34"
    build ["vowels.out2","-j3","--sleep"]
    assertContents "trace.txt" "11"
    assertContents "vowels.out2" "4"

    build ["vowels.out1","-j3","--sleep"]
    assertContents "trace.txt" "111"
    assertContents "vowels.out1" "4"

    build ["foo.lang","bar.lang"]
    assertContents "foo.lang" "on"
    assertContents "compiler.txt" "off"
    writeFile "compiler.txt" "unstarted"
    build ["foo.lang","bar.lang"]
    assertContents "compiler.txt" "unstarted"

    writeFile "bug796.1" "a"
    writeFile "bug796.2" "b"
    build ["bug796", "--sleep"]
    assertContents "bug796" "ab"
    writeFile "bug796.1" "A"
    build ["bug796", "--sleep"]
    assertContents "bug796" "Ab"
    writeFile "bug796.2" "B"
    build ["bug796", "--sleep"]
    assertContents "bug796" "AB"

shake-0.19.8/src/Test/Cleanup.hs

{-# LANGUAGE ScopedTypeVariables, DeriveDataTypeable #-}

-- Initially copied from https://raw.githubusercontent.com/snoyberg/conduit/master/resourcet/test/main.hs
-- on 2018-10-11
module Test.Cleanup(main) where

import General.Cleanup
import Data.Typeable
import Control.Monad
import Data.IORef
import Control.Exception
import Test.Type

main = testSimple $ do
    do -- survives releasing bottom
        x <-
            newIORef (0 :: Int)
        handle (\(_ :: SomeException) -> pure ()) $
            withCleanup $ \cleanup -> do
                _ <- register cleanup $ modifyIORef x (+1)
                release undefined
        (=== 1) =<< readIORef x

    do -- early release
        x <- newIORef (0 :: Int)
        withCleanup $ \cleanup -> do
            undo <- register cleanup $ modifyIORef x (+1)
            release undo
            (=== 1) =<< readIORef x
        (=== 1) =<< readIORef x

    do -- unprotect keeps resource from being cleared
        x <- newIORef (0 :: Int)
        _ <- withCleanup $ \cleanup -> do
            key <- register cleanup $ writeIORef x 1
            unprotect key
        (=== 0) =<< readIORef x

    do -- cleanup actions are masked https://github.com/snoyberg/conduit/issues/144
        let checkMasked name = do
                ms <- getMaskingState
                unless (ms == MaskedUninterruptible) $
                    error $ show (name, ms)
        withCleanup $ \cleanup -> do
            register cleanup (checkMasked "release") >>= release
            register cleanup (checkMasked "normal")
        Left Dummy <- try $ withCleanup $ \cleanup -> do
            register cleanup (checkMasked "exception")
            throwIO Dummy
        pure ()

data Dummy = Dummy deriving (Show, Typeable)
instance Exception Dummy

shake-0.19.8/src/Test/CloseFileHandles.hs

{-# LANGUAGE CPP #-}

module Test.CloseFileHandles(main) where

import Test.Type

#ifdef mingw32_HOST_OS
main = testNone -- don't know how to do this on windows
#else
import Development.Shake
import Development.Shake.FilePath
import System.Posix.IO
import Control.Monad.Extra
import System.Exit
import System.IO

main = testBuild test $ do
    let helper = toNative $ "helper/close_file_handles_helper" <.> exe
    let name !> test = do
            want [name]
            name ~> do need ["helper/close_file_handles_helper" <.> exe]; test

    let helper_source = unlines
            ["import System.Environment"
            ,"import System.Posix.IO"
            ,"import System.IO"
            ,"import System.Exit"
            ,""
            ,"main = do"
            ,"    args <- getArgs"
            ,"    case args of"
            ,"        [fdString] -> do"
            ,"            handle <- fdToHandle (read fdString)"
            ,"            hClose handle"
            ,"            exitSuccess"
            ,"        _ -> do"
            ,"            progName <- getProgName"
            ,"            hPutStrLn stderr
                (\"usage: \" ++ progName ++ \" \\n  tries closing the file descriptor number\\n  exits successful, if the file descriptor was open\")"
            ,"            exitWith (ExitFailure 3)"]

    "close_file_handles_helper.hs" %> \out -> do
        need ["../../src/Test/CloseFileHandles.hs"]
        writeFileChanged out helper_source

    ["helper/close_file_handles_helper"<.>exe, "close_file_handles_helper.hi", "close_file_handles_helper.o"] &%> \_ -> do
        need ["close_file_handles_helper.hs"]
        cmd "ghc --make" "close_file_handles_helper.hs -o helper/close_file_handles_helper"

    let callWithOpenFile cmdWithOpts = withTempFile $ \file ->
            actionBracket (openFile file AppendMode) hClose $ \h -> do
                fd <- liftIO $ handleToFd h
                (Exit c, Stdout _, Stderr _) <- cmdWithOpts helper (show fd) :: Action (Exit, Stdout String, Stderr String)
                pure c

    "defaultbehaviour" !> do
        c <- callWithOpenFile cmd
        liftIO $ assertBool (c == ExitSuccess) "handle closed without option CloseFileHandles"

    "closing" !> do
        c <- callWithOpenFile (cmd CloseFileHandles)
        liftIO $ assertBool (c /= ExitSuccess) "handle not closed with option CloseFileHandles"

test build = do
    whenM hasTracker $ build ["-j4", "--no-lint"]
    build ["-j4"]
#endif

shake-0.19.8/src/Test/Command.hs

{-# LANGUAGE Rank2Types, ScopedTypeVariables #-}

module Test.Command(main) where

import Development.Shake
import Development.Shake.FilePath
import Control.Exception.Extra
import System.Time.Extra
import General.Extra
import Control.Monad.Extra
import System.Directory
import Test.Type
import System.Exit
import System.Process
import Data.Tuple.Extra
import Data.List.Extra
import Control.Monad.IO.Class
import System.Info.Extra
import qualified Data.ByteString.Char8 as BS
import qualified Data.ByteString.Lazy.Char8 as LBS

main = testBuild test $ do
    -- shake_helper must be in a subdirectory so we can test placing that subdir on the $PATH
    let helper = toNative $ "helper/shake_helper" <.> exe
    let name !> test = do
            want [name]
            name ~>
                do need ["helper/shake_helper" <.> exe]; test

    let helper_source = unlines
            ["import System.Process"
            ,"import Control.Monad"
            ,"import Control.Concurrent"
            ,"import System.Directory"
            ,"import System.Environment"
            ,"import System.Exit"
            ,"import System.IO"
            ,"import qualified Data.ByteString.Lazy.Char8 as LBS"
            ,"main = do"
            ,"    args <- getArgs"
            ,"    exe <- getExecutablePath"
            ,"    forM_ args $ \\(a:rg) -> do"
            ,"        case a of"
            ,"            'o' -> putStrLn rg"
            ,"            'e' -> hPutStrLn stderr rg"
            ,"            'x' -> exitFailure"
            ,"            'c' -> putStrLn =<< getCurrentDirectory"
            ,"            'v' -> putStrLn =<< getEnv rg"
            ,"            'w' -> threadDelay $ ceiling $ (read rg :: Double) * 1000000"
            ,"            'r' -> LBS.putStr $ LBS.replicate (read rg) 'x'"
            ,"            'i' -> putStr =<< getContents"
            ,"            's' -> void $ readProcess exe [rg] \"\""
            ,"        hFlush stdout"
            ,"        hFlush stderr"]

    "shake_helper.hs" %> \out -> do
        need ["../../src/Test/Command.hs"]
        writeFileChanged out helper_source

    ["helper/shake_helper" <.> exe, "shake_helper.o", "shake_helper.hi"] &%> \_ -> do
        need ["shake_helper.hs"]
        cmd "ghc --make" "shake_helper.hs -o helper/shake_helper"

    "capture" !> do
        (Stderr err, Stdout out) <- cmd helper ["ostuff goes here","eother stuff here"]
        liftIO $ out === "stuff goes here\n"
        liftIO $ err === "other stuff here\n"
        liftIO $ waits $ \w -> do
            Stdouterr out <- cmd helper Shell ["o1",w,"e2",w,"o3"]
            out === "1\n2\n3\n"

    "failure" !> do
        (Exit e, Stdout (), Stderr ()) <- cmd helper "oo ee x"
        when (e == ExitSuccess) $ error "/= ExitSuccess"
        liftIO $ assertException ["BAD"] $ cmd_ helper "oo eBAD x" (EchoStdout False) (EchoStderr False)
        liftIO $ assertException ["MORE"] $ cmd_ helper "oMORE eBAD x" (WithStdout True) (WithStderr False) (EchoStdout False) (EchoStderr False)

    "throws" ~> cmd Shell "not_a_process foo"

    "cwd" !> do
        -- FIXME: Linux searches the Cwd argument for the file, Windows searches getCurrentDirectory
        helper <- liftIO $ canonicalizePath $ "helper/shake_helper" <.> exe
        liftIO $ createDirectoryRecursive "helper/tests"
        Stdout out <- cmd (Cwd "helper/tests")
            (Cwd "..") helper "c"
        let norm = fmap dropTrailingPathSeparator . canonicalizePath . trim
        liftIO $ join $ liftM2 (===) (norm out) (norm "helper")

    let checkTimeout act = do
            offset <- liftIO offsetTime
            act
            t <- liftIO offset
            putInfo $ "Timed out in " ++ showDuration t
            when (t < 2 || t > 8) $
                error $ "failed to timeout, took " ++ show t

    "timeout1" !> checkTimeout (do
        Exit exit <- cmd (Timeout 2) helper "w20"
        liftIO $ assertBool (exit /= ExitSuccess) "exit was ExitSuccess")

    "timeout2" !> do
        checkTimeout $ liftIO $ timeout 2 $ cmd_ helper "w20"

    -- disabled on Windows because when you abort a Shell process you get a
    -- "Do you want to terminate the batch file (Y/N)"
    unless isWindows $ "timeout3" !> do
        checkTimeout $ liftIO $ timeout 2 $ cmd_ Shell helper "w20"

    "timeout4" !> do
        checkTimeout $ liftIO $ timeout 2 $ cmd_ helper "sw20"

    "timeout5" !> do
        checkTimeout $ liftIO $ timeout 2 $ cmd_ helper "i w20"

    "env" !> do
        -- use liftIO since it blows away PATH which makes lint-tracker stop working
        Stdout out <- liftIO $ cmd (Env [("FOO","HELLO SHAKE")]) Shell helper "vFOO"
        liftIO $ out === "HELLO SHAKE\n"
        StdoutTrim out <- cmd (AddEnv "FOO" "GOODBYE SHAKE") Shell helper "vFOO"
        liftIO $ out === "GOODBYE SHAKE"

    "space" !> do
        Stdout out <- cmd helper ["oSPACE 1"]
        liftIO $ out === "SPACE 1\n"
        Stdout out <- cmd Shell helper "\"oSPACE 2\""
        liftIO $ out === "SPACE 2\n"
        whenM (liftIO hasTracker) $ do
            Stdout out <- cmd Shell AutoDeps helper "\"oSPACE 2\""
            liftIO $ out === "SPACE 2\n"
        (Stdout (), CmdLine x) <- cmd helper ["oSPACE 3","oDIRECT"]
        unless (" \"oSPACE 3\" oDIRECT" `isSuffixOf` replace "\'" "\"" x) $
            fail $ "Invalid CmdLine, " ++ x

    "path" !> do
        let path = AddPath [dropTrailingPathSeparator "helper"] []
        cmd_ "helper/shake_helper"
        cmd_ $ "helper/shake_helper" <.> exe
        cmd_ path Shell "shake_helper"
        cmd_ path "shake_helper"

    "file" !> do
        let file = "file.txt"
        cmd_ helper (FileStdout file) (FileStderr file) (EchoStdout False) (EchoStderr False) (WithStderr False) "ofoo ebar obaz"
        liftIO $ assertContents file "foo\nbar\nbaz\n"
        liftIO $ waits $ \w -> do
            Stderr err <- cmd helper (FileStdout file) (FileStderr file) ["ofoo",w,"ebar",w,"obaz"]
            err === "bar\n"
            assertContents file "foo\nbar\nbaz\n"

    "timer" !> do
        timer $ cmd helper

    "binary" !> do
        (Stdout str, Stdout bs) <- cmd BinaryPipes helper "ofoo"
        liftIO $ (===) (str, bs) $ second BS.pack $ dupe $ if isWindows then "foo\r\n" else "foo\n"
        (Stdout str, Stdout bs) <- cmd helper "ofoo"
        liftIO $ (str, bs) === ("foo\n", BS.pack $ if isWindows then "foo\r\n" else "foo\n")
        pure ()

    "large" !> do
        (Stdout (_ :: String), CmdTime t1) <- cmd helper "r10000000"
        (Stdout (_ :: LBS.ByteString), CmdTime t2) <- cmd helper "r10000000"
        t3 <- withTempFile $ \file -> fromCmdTime <$> cmd helper "r10000000" (FileStdout file)
        liftIO $ putStrLn $ "Capturing 10Mb takes: " ++ intercalate "," [s ++ " = " ++ showDuration d | (s,d) <- [("String",t1),("ByteString",t2),("File",t3)]]

    "stdin" !> do
        withTempFile $ \file -> do
            liftIO $ writeFile file " "
            Stdout (x :: String) <- cmd helper "i" (Stdin "hello") (FileStdin file) (StdinBS $ LBS.pack "world")
            liftIO $ x === "hello world"

    "async" !> do
        let file = "async.txt"
        pid <- cmd helper (FileStdout file) "w2" "ohello"
        Nothing <- liftIO $ getProcessExitCode pid
        ExitSuccess <- liftIO $ waitForProcess pid
        liftIO $ assertContents file "hello\n"

test build = do
    -- reduce the overhead by running all the tests in parallel
    -- lint can make a big difference to the command lines, so test with and without
    whenM hasTracker $ build ["-j4","--no-lint"]
    build ["-j4"]
    assertException ["not_a_process foo"] (build ["throws","--quiet"])

timer :: (CmdResult r, MonadIO m) => (forall r .
    CmdResult r => m r) -> m r
timer act = do
    (CmdTime t, CmdLine x, r) <- act
    liftIO $ putStrLn $ "Command " ++ x ++ " took " ++ show t ++ " seconds"
    pure r

waits :: (String -> IO ()) -> IO ()
waits op = f 0
    where f w | w > 1 = op "w10"
              | otherwise = catch_ (op $ "w" ++ show w) $ const $ f $ w + 0.1

shake-0.19.8/src/Test/Config.hs

module Test.Config(main) where

import Development.Shake
import Development.Shake.FilePath
import Development.Shake.Config
import Test.Type
import Data.List.Extra
import qualified Data.HashMap.Strict as Map
import Data.Maybe

main = testBuild test $ do
    want ["hsflags.var","cflags.var","none.var","keys"]
    usingConfigFile "config"
    "*.var" %> \out -> do
        cfg <- getConfig $ upper $ takeBaseName out
        liftIO $ appendFile (out -<.> "times") "X"
        writeFile' out $ fromMaybe "" cfg
    "keys" %> \out -> do
        liftIO $ appendFile "keys.times" "X"
        liftIO . writeFile out . unwords =<< getConfigKeys

test build = do
    build ["clean"]
    writeFile "config" $ unlines
        ["HEADERS_DIR = /path/to/dir"
        ,"CFLAGS = -O2 -I${HEADERS_DIR} -g"
        ,"HSFLAGS = -O2"]
    build []
    assertContents "cflags.var" "-O2 -I/path/to/dir -g"
    assertContents "hsflags.var" "-O2"
    assertContents "none.var" ""
    assertContents "keys" "CFLAGS HEADERS_DIR HSFLAGS"

    appendFile "config" $ unlines
        ["CFLAGS = $CFLAGS -w"]
    build []
    assertContents "cflags.var" "-O2 -I/path/to/dir -g -w"
    assertContents "hsflags.var" "-O2"
    assertContents "cflags.times" "XX"
    assertContents "hsflags.times" "X"
    assertContents "keys.times" "X"

    -- Test readConfigFileWithEnv
    writeFile "config" $ unlines
        ["HEADERS_DIR = ${SOURCE_DIR}/path/to/dir"
        ,"CFLAGS = -O2 -I${HEADERS_DIR} -g"]
    vars <- readConfigFileWithEnv [("SOURCE_DIR", "/path/to/src")] "config"
    assertBool (Map.lookup "HEADERS_DIR" vars == Just "/path/to/src/path/to/dir") $
        "readConfigFileWithEnv:" ++
        " Expected: " ++ show (Just "/path/to/src/path/to/dir") ++
        " Got: " ++ show (Map.lookup "HEADERS_DIR" vars)
    assertBool
        (Map.lookup "CFLAGS" vars == Just "-O2 -I/path/to/src/path/to/dir -g") $
        "readConfigFileWithEnv:" ++
        " Expected: " ++ show (Just "-O2 -I/path/to/src/path/to/dir -g") ++
        " Got: " ++ show (Map.lookup "CFLAGS" vars)

shake-0.19.8/src/Test/Database.hs

module Test.Database(main) where

import Control.Concurrent.Extra
import Control.Exception.Extra
import Control.Monad
import Data.List
import Data.IORef
import Development.Shake
import Development.Shake.Database
import Development.Shake.FilePath
import System.Time.Extra
import System.Directory as IO
import Test.Type

rules = do
    "*.out" %> \out -> do
        liftIO $ appendFile "log.txt" "x"
        copyFile' (out -<.> "in") out
    removeFilesAfter "." ["log.txt"]
    "*.err" %> \out -> fail out
    phony "sleep" $ liftIO $ sleep 20

main = testSimple $ do
    let opts = shakeOptions{shakeFiles="/dev/null"}
    writeFile "a.in" "a"
    writeFile "b.in" "b"
    sleepFileTime
    writeFile "log.txt" ""

    (open, close) <- shakeOpenDatabase opts rules
    db <- open
    ([12], after) <- shakeRunDatabase db [need ["a.out"] >> pure 12]
    assertContents "log.txt" "x"
    writeFile "a.in" "A"
    shakeRunDatabase db [need ["a.out","b.out"]]
    assertContents "a.out" "A"
    assertContents "log.txt" "xxx"
    ([13,14], _) <- shakeRunDatabase db [need ["a.out"] >> pure 13, pure 14]
    assertContents "log.txt" "xxx"
    live <- shakeLiveFilesDatabase db
    sort live === ["a.in","a.out"]
    shakeProfileDatabase db "-"

    -- check that parallel runs blow up, and that we can throw async exceptions to kill it
    assertWithin 10 $ do
        threads <- newBarrier
        results <- replicateM 2 newBarrier
        ts <- forM results $ \result ->
            forkFinally (void $ shakeRunDatabase db [need ["sleep"]]) $ \r -> mask_ $ do
                print $ "Failed with " ++ show r
                signalBarrier result r
                threads <- waitBarrier threads
                me <- myThreadId
                forM_ threads $ \t -> when (t /= me) $
                    throwTo t $ ErrorCall "ab123c"
        signalBarrier threads ts
        results <- show <$> mapM waitBarrier results
        assertBool ("ab123c"
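Test.Config above checks that `readConfigFileWithEnv` substitutes `${VAR}` references from a supplied environment. The core substitution step can be sketched as a pure single-pass expansion; this is a simplified illustration (unresolved variables expand to empty here, which is one plausible policy and not necessarily exactly what shake's config module does):

```haskell
-- Single-pass expansion of ${VAR} references against an association
-- list, mirroring the substitutions the Config test asserts on.
-- Purely illustrative; not the shake-config implementation.
expand :: [(String, String)] -> String -> String
expand env = go
  where
    go ('$':'{':rest) =
        let (var, rest') = break (== '}') rest
        in lookupDef var ++ go (drop 1 rest')
    go (c:cs) = c : go cs
    go [] = []
    lookupDef var = maybe "" id (lookup var env)
```

Applied to the test's data, `expand [("HEADERS_DIR","/path/to/dir")]` turns `-O2 -I${HEADERS_DIR} -g` into `-O2 -I/path/to/dir -g`, matching the asserted `cflags.var` contents.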
            `isInfixOf` results) "Contains ab123c"
        assertBool ("currently running" `isInfixOf` results) "Contains 'currently using'"

    close
    assertException ["already closed"] $ shakeRunDatabase db []
    shakeRunAfter opts after
    assertBoolIO (not <$> IO.doesFileExist "log.txt") "Log must be deleted"

    errs <- shakeWithDatabase opts{shakeStaunch=True, shakeVerbosity=Silent} rules $ \db -> do
        assertException ["Error when running"] $ shakeRunDatabase db [need ["foo.err","bar.err"]]
        shakeErrorsDatabase db
    sort (map fst errs) === ["bar.err","foo.err"]

    -- check the progress thread gets killed properly on normal cleanup
    ref <- newIORef 0
    opts <- pure opts{shakeProgress = const $ bracket_ (modifyIORef ref succ) (modifyIORef ref succ) $ sleep 100}
    (open, close) <- shakeOpenDatabase opts rules
    db <- open
    ([12], after) <- shakeRunDatabase db [need ["a.out"] >> liftIO (modifyIORef ref succ) >> pure 12]
    (=== 3) =<< readIORef ref -- success if it all shuts down cleanly

    -- and on an exception
    writeIORef ref 0
    assertException ["terminate"] $ shakeRunDatabase db [liftIO (modifyIORef ref succ) >> fail "terminate"]
    (=== 3) =<< readIORef ref

    -- and on an external exception
    writeIORef ref 0
    bar <- newBarrier; bar2 <- newBarrier
    t <- flip forkFinally (signalBarrier bar2) $ void $ shakeRunDatabase db $ pure $ do
        liftIO $ modifyIORef ref succ
        liftIO $ signalBarrier bar ()
        need ["sleep"]
    waitBarrier bar
    sleep 0.1
    killThread t
    waitBarrier bar2
    (=== 3) =<< readIORef ref

shake-0.19.8/src/Test/Digest.hs

module Test.Digest(main) where

import Control.Monad
import Development.Shake
import Test.Type

main = testBuild test $ do
    want ["Out.txt","Out2.txt"]

    "Out.txt" %> \out -> do
        txt <- readFile' "In.txt"
        liftIO $ appendFile out txt

    ["Out1.txt","Out2.txt"] &%> \[out1,out2] -> do
        txt <- readFile' "In.txt"
        liftIO $ appendFile out1 txt
        liftIO $ appendFile out2 txt

    ["Bug1.txt","Bug2.txt"] &%> \[out1,out2] -> do
        need ["Bug3.txt"]
        writeFile' out1 "X"
        writeFile' out2 "Y"

    "leaf" ~> pure ()
    "node1.txt" %> \file -> do need ["leaf"]; writeFile' file "x"
    "node2.txt" %> \file -> do need ["node1.txt"]; liftIO $ appendFile file "x"

    ["rewrite1","rewrite2"] &%> \outs -> do
        alwaysRerun
        forM_ outs $ \out -> writeFile' out "rewrite"

test build = do
    let outs = ["Out.txt","Out1.txt","Out2.txt"]
    let writeOut x = forM_ outs $ \out -> writeFile out x
    let writeIn = writeFile "In.txt"
    let assertOut x = forM_ outs $ \out -> assertContents out x
    writeOut ""

    writeIn "X"
    build ["--sleep","--digest-and"]
    assertOut "X"

    -- should not involve a hash calculation (sadly no way to test that)
    build ["--sleep","--digest-and"]
    assertOut "X"

    writeIn "X"
    build ["--sleep","--digest-and"]
    assertOut "X"

    writeIn "X"
    build ["--sleep","--digest-or"]
    assertOut "XX"

    writeIn "X"
    build ["--sleep","--digest-and"]
    assertOut "XX"

    build ["--sleep","--digest-and"]
    writeOut "XX"
    build ["--sleep","--digest-and"]
    assertOut "XX"

    build ["--sleep","--digest-and"]
    writeOut "Y"
    build ["--sleep","--digest-and"]
    assertOut "YX"

    writeIn "X"
    build ["--sleep","--digest"]
    assertOut "YX"

    writeIn "Z"
    build ["--sleep","--digest-and-input"]
    assertOut "YXZ"

    build ["--sleep","--digest-and-input"]
    writeOut "YXZ"
    build ["--sleep","--digest-and-input"]
    assertOut "YXZZ"

    writeIn "Q"
    build ["--sleep","--digest-and-input"]
    assertOut "YXZZQ"

    writeIn "Q"
    build ["--sleep","--digest-and-input"]
    assertOut "YXZZQ"

    -- test for #218
    forM_ [("--digest",1),("--digest-and",1),("--digest-or",2),("--digest-and-input",2),("",2)] $ \(flag,count) -> do
        writeFile "node2.txt" "y"
        replicateM_ 2 $ build $ ["node2.txt","--sleep"] ++ [flag | flag /= ""]
        assertContents "node2.txt" $ 'y' : replicate count 'x'

    -- test for #296
    writeFile "Bug3.txt" "X"
    build ["--digest-and-input","Bug1.txt","--sleep"]
    writeFile "Bug3.txt" "Y"
    build ["--digest-and-input","Bug1.txt","--lint"]

    -- test for #427
    build ["rewrite1","--digest-and"]
    build ["rewrite1","--digest-and","--lint","--sleep"]
shake-0.19.8/src/Test/Directory.hs

{-# LANGUAGE TupleSections #-}

module Test.Directory(main) where

import Development.Shake
import Development.Shake.FilePath
import Test.Type
import Data.List
import Data.Function
import Control.Monad
import General.Extra
import System.Directory(createDirectory)
import qualified System.Directory as IO
import qualified System.IO.Extra as IO

-- Use escape characters, _o=* _l=/ __=(space)
readEsc ('_':'o':xs) = '*' : readEsc xs
readEsc ('_':'l':xs) = '/' : readEsc xs
readEsc ('_':'_':xs) = ' ' : readEsc xs
readEsc (x:xs) = x : readEsc xs
readEsc [] = []

showEsc = concatMap f
    where f '*' = "_o"
          f '/' = "_l"
          f ' ' = "__"
          f x = [x]

main = testBuild test $ do
    "*.contents" %> \out -> writeFileLines out =<< getDirectoryContents (readEsc $ dropExtension out)
    "*.dirs" %> \out -> writeFileLines out =<< getDirectoryDirs (readEsc $ dropExtension out)
    "*.files" %> \out -> do
        let pats = readEsc $ dropExtension out
        let (x:xs) = ["" | " " `isPrefixOf` pats] ++ words pats
        writeFileLines out . map toStandard =<< getDirectoryFiles x xs

    "*.exist" %> \out -> do
        let xs = words $ readEsc $ dropExtension out
        fs <- mapM doesFileExist xs
        ds <- mapM doesDirectoryExist xs
        let bool x = if x then "1" else "0"
        writeFileLines out $ zipWith ((++) `on` bool) fs ds

    "dots" %> \out -> do
        b1 <- liftM2 (==) (getDirectoryContents ".") (getDirectoryContents "")
        b2 <- liftM2 (==) (getDirectoryDirs ".") (getDirectoryDirs "")
        b3 <- liftM2 (==) (getDirectoryFiles "." ["*.txt"]) (getDirectoryFiles "" ["*.txt"])
        b4 <- liftM2 (==) (getDirectoryFiles "." ["C.txt/*.txt"]) (getDirectoryFiles "" ["C.txt/*.txt"])
        b5 <- liftM2 (==) (getDirectoryFiles "."
["//*.txt"]) (getDirectoryFiles "" ["//*.txt"]) writeFileLines out $ map show [b1,b2,b3,b4,b5] test build = do let demand x ys = let f = showEsc x in do build [f]; assertContents f $ unlines $ words ys build ["clean"] demand " *.txt.files" "" demand " //*.txt.files" "" demand ".dirs" "" demand "A.txt B.txt C.txt.exist" "00 00 00" writeFile "A.txt" "" writeFile "B.txt" "" createDirectory "C.txt" writeFile "C.txt/D.txt" "" writeFile "C.txt/E.xtx" "" demand " *.txt.files" "A.txt B.txt" demand ".dirs" "C.txt" demand "A.txt B.txt C.txt.exist" "10 10 01" demand " //*.txt.files" "A.txt B.txt C.txt/D.txt" demand "C.txt *.txt.files" "D.txt" demand " *.txt //*.xtx.files" "A.txt B.txt C.txt/E.xtx" demand " C.txt/*.files" "C.txt/D.txt C.txt/E.xtx" demand " missing_dir/*.files" "" demand " missing_dir/bar/*.files" "" demand " //missing_dir/*.files" "" assertException ["missing_dir","does not exist"] $ build ["--quiet",showEsc "missing_dir *.files"] build ["dots","--no-lint"] assertContents "dots" $ unlines $ words "True True True True True" let removeTest pat del keep = IO.withTempDir $ \dir -> do forM_ (del ++ keep) $ \s -> do createDirectoryRecursive $ dir takeDirectory s unless (hasTrailingPathSeparator s) $ writeFile (dir s) "" removeFiles dir pat createDirectoryRecursive dir forM_ (map (False,) del ++ map (True,) keep) $ \(b,s) -> do b2 <- (if hasTrailingPathSeparator s then IO.doesDirectoryExist else IO.doesFileExist) $ dir s when (b /= b2) $ do let f b = if b then "present" else "missing" error $ "removeFiles mismatch: with pattern " ++ show pat ++ ", " ++ s ++ " should be " ++ f b ++ " but is " ++ f b2 removeTest ["//bob"] ["test/bob","more/bob"] ["extra/obo"] removeTest ["bob"] ["bob/"] ["bar/"] removeTest ["*.hs"] ["test.hs"] ["extra/more.hs","new.txt"] removeTest ["baz"] ["baz"] ["foo","bar/bob"] removeTest ["baz"] ["baz/bob","baz/"] ["foo","bar/bob"] removeTest ["Foo//*"] ["Foo/bar","Foo/Quux/bar","Foo/Quux/"] [] removeTest ["Foo//"] ["Foo/"] ["bar"] removeTest 
["baz"] [] ["test.hs","bar/","foo/"] removeTest ["bob//*"] [] ["test/bob/"] removeTest ["//bob"] ["test/bob/"] ["test/"] removeTest ["//*.txt"] ["more/a.txt"] ["more/"] removeTest ["//*.txt"] ["more/a.txt/"] ["more/"] removeTest ["//*.txt"] ["more/a.txt/","more/b.txt"] ["more/"] removeTest ["//*.txt"] [] ["more/"] removeTest ["a//b"] ["a/c/b"] [] removeFiles "non-existing-directory" ["*"] shake-0.19.8/src/Test/Docs.hs0000644000000000000000000004146307346545000014037 0ustar0000000000000000{-# LANGUAGE PatternGuards, ViewPatterns #-} module Test.Docs(main) where import Development.Shake import Development.Shake.FilePath import qualified System.FilePattern.Directory as IO import System.Directory import Test.Type import Data.Char import Data.List.Extra -- Can't get the paths to work on CI with tracking etc main = testBuild (notCI . defaultTest) $ do let index = "dist/doc/html/shake/index.html" let setup = "dist/setup.exe" let config = "dist/setup-config" want ["Success.txt"] let trackIgnore = trackAllow ["dist/**", "dist-newstyle/**"] let needSource = need =<< getDirectoryFiles "." 
(map (shakeRoot ) ["src/Development/Shake.hs","src/Development/Shake//*.hs","src/Development/Ninja/*.hs","src/General//*.hs"]) let runSetup :: [String] -> Action () runSetup args = do trackIgnore need [setup] -- Make Cabal and Stack play nicely with GHC_PACKAGE_PATH setup <- liftIO $ canonicalizePath setup cmd_ (RemEnv "GHC_PACKAGE_PATH") (Cwd shakeRoot) setup args setup %> \_ -> do -- Important to compile the setup binary, or we run foul of -- https://gitlab.haskell.org/ghc/ghc/issues/17575 trackIgnore need [shakeRoot "Setup.hs"] setup <- liftIO $ canonicalizePath setup curdir <- liftIO $ canonicalizePath "dist" cmd_ (Cwd shakeRoot) "ghc -package=Cabal Setup.hs -o" [setup] "-outputdir" [curdir] config %> \_ -> do path <- getEnv "GHC_PACKAGE_PATH" dist <- liftIO $ canonicalizePath "dist" -- make sure it works even if we cwd need [shakeRoot "shake.cabal"] runSetup $ ["configure","--builddir=" ++ dist,"--user"] ++ -- package-db is very sensitive, see #267 -- note that the reverse ensures the behaviour is consistent between the flags and the env variable ["--package-db=" ++ x | x <- maybe [] (reverse . filter (`notElem` [".",""]) . 
splitSearchPath) path] -- Paths_shake is only created by "Setup build" (which we want to skip), and required by "Setup haddock", so we fake it copyFile' (shakeRoot "src/Paths.hs") "dist/build/autogen/Paths_shake.hs" copyFile' (shakeRoot "src/Paths.hs") "dist/build/shake/autogen/Paths_shake.hs" writeFile' "dist/build/autogen/cabal_macros.h" "" writeFile' "dist/build/shake/autogen/cabal_macros.h" "" index %> \_ -> do need $ config : map (shakeRoot ) ["shake.cabal","Setup.hs","README.md","CHANGES.txt","docs/Manual.md","docs/shake-progress.png"] needSource trackIgnore dist <- liftIO $ canonicalizePath "dist" runSetup ["haddock", "--builddir=" ++ dist] "Part_*.hs" %> \out -> do need [shakeRoot "src/Test/Docs.hs"] -- so much of the generator is in this module let noR = filter (/= '\r') src <- if "_md" `isSuffixOf` takeBaseName out then fmap (findCodeMarkdown . lines . checkBlacklist . noR) $ readFile' $ shakeRoot "docs/" ++ drop 5 (reverse (drop 3 $ reverse $ takeBaseName out)) ++ ".md" else fmap (findCodeHaddock . checkBlacklist . 
noR) $ readFile' $ "dist/doc/html/shake/" ++ replace "_" "-" (drop 5 $ takeBaseName out) ++ ".html" let (imports,rest) = partition ("import " `isPrefixOf`) $ showCode src writeFileChanged out $ unlines $ ["{-# LANGUAGE DeriveDataTypeable, RankNTypes, ExtendedDefaultRules, GeneralizedNewtypeDeriving #-}" ,"{-# LANGUAGE NoMonomorphismRestriction, ScopedTypeVariables, ConstraintKinds, FlexibleContexts, TypeFamilies #-}" ,"{-# OPTIONS_GHC -w #-}" ,"module " ++ takeBaseName out ++ "() where" ,"import Control.Applicative" ,"import Control.Concurrent" ,"import Control.Exception" ,"import Control.Monad" ,"import Control.Monad.Trans.Reader" ,"import Data.ByteString(ByteString, pack, unpack)" ,"import qualified Data.ByteString.Char8 as BS" ,"import qualified System.Directory.Extra as IO" ,"import qualified System.IO.Extra as IO" ,"import Data.Char" ,"import Data.Data" ,"import Data.Dynamic" ,"import Data.List.Extra" ,"import System.Time.Extra" ,"import Data.Maybe" ,"import Data.Monoid" ,"import Development.Shake hiding ((*>))" ,"import Development.Shake.Command" ,"import Development.Shake.Classes" ,"import Development.Shake.Database" ,"import Development.Shake.Rule" ,"import Development.Shake.Util" ,"import Development.Shake.FilePath" ,"import System.Console.GetOpt" ,"import System.Directory(setCurrentDirectory, withCurrentDirectory)" ,"import qualified System.Directory" ,"import System.Environment(withArgs, lookupEnv, getEnvironment)" ,"import System.Process" ,"import System.Exit" ,"import Control.Applicative" ,"import Control.Monad.IO.Class" ,"import Control.Monad.Fail" ,"import System.IO hiding (readFile')"] ++ ["import " ++ replace "_" "." 
(drop 5 $ takeBaseName out) | not $ "_md.hs" `isSuffixOf` out] ++ imports ++ ["(==>) :: Bool -> Bool -> Bool" ,"(==>) = undefined" ,"(<==) = ()" ,"infix 1 ==>" ,"infix 0 ===" ,"(===) :: a -> a -> b" ,"(===) = undefined" ,"forAll f = f undefined" ,"remaining = 1.1" ,"done = 1.1" ,"time_elapsed = 1.1" ,"old = \"\"" ,"new = \"\"" ,"myvar = \"\"" ,"myfile = \"\"" ,"inputs = [\"\"]" ,"files = [\"\"]" ,"input = \"\"" ,"output = \"\"" ,"opts = shakeOptions" ,"result = undefined :: IO (Maybe (Rules ()))" ,"launchMissiles = undefined :: Bool -> IO ()" ,"myVariable = ()" ,"instance Eq (OptDescr a)" ,"(foo,bar,baz) = undefined" ,"(p1,p2) = (0.0, 0.0)" ,"(r1,r2) = (pure () :: Rules(), pure () :: Rules())" ,"xs = []" ,"ys = []" ,"os = [\"file.o\"]" ,"out = \"\"" ,"str1 = \"\"" ,"str2 = \"\"" ,"def = undefined" ,"var = undefined" ,"newValue = undefined" ,"newStore = BS.empty" ,"change = ChangedNothing" ,"str = \"\""] ++ rest "Files.lst" %> \out -> do need [shakeRoot "src/Test/Docs.hs"] -- so much of the generator is in this module need [index] filesHs <- liftIO $ IO.getDirectoryFiles "dist/doc/html/shake" ["Development-*.html"] -- filesMd on Travis will only include Manual.md, since it's the only one listed in the .cabal -- On AppVeyor, where we build from source, it will check the rest of the website filesMd <- getDirectoryFiles (shakeRoot "docs") ["*.md"] writeFileChanged out $ unlines $ ["Part_" ++ replace "-" "_" (takeBaseName x) | x <- filesHs, not $ any (`isSuffixOf` x) ["-Classes.html", "-FilePath.html"]] ++ ["Part_" ++ takeBaseName x ++ "_md" | x <- filesMd, takeBaseName x `notElem` ["Developing","Model","Architecture"]] let needModules = do mods <- readFileLines "Files.lst"; need [m <.> "hs" | m <- mods]; pure mods "Main.hs" %> \out -> do mods <- needModules writeFileLines out $ ["module Main(main) where"] ++ ["import " ++ m ++ "()" | m <- mods] ++ ["main = pure ()"] "Success.txt" %> \out -> do putInfo . 
("Checking documentation for:\n" ++) =<< readFile' "Files.lst" needModules need ["Main.hs"] trackIgnore needSource cmd_ "ghc -fno-code -ignore-package=hashmap" ["-idist/build/autogen","-i" ++ shakeRoot "src","Main.hs"] writeFile' out "" checkBlacklist :: String -> String checkBlacklist xs = if null bad then xs else error $ show ("Blacklist", bad) where bad = [(w, x) | x <- map lower $ lines xs, w <- blacklist, w `isInfixOf` x] --------------------------------------------------------------------- -- FIND THE CODE newtype Code = Code [String] deriving (Show,Eq,Ord) findCodeHaddock :: String -> [Code] findCodeHaddock src = [ Code $ unindent $ lines $ innerText x | tag <- ["code","pre"] , x <- insideTag tag src , let bad = nubOrd (insideTag "em" x) \\ italics , if null bad then True else error $ "Bad italics, " ++ show bad ] findCodeMarkdown :: [String] -> [Code] findCodeMarkdown (x:xs) | indented x && not (isBlank x) = let (a,b) = span (\x -> indented x || isBlank x) (x:xs) in Code (dropWhileEnd isBlank $ unindent a) : findCodeMarkdown b where indented x = length (takeWhile isSpace x) >= 4 findCodeMarkdown (x:xs) = map (Code . pure) (evens $ splitOn "`" x) ++ findCodeMarkdown xs where evens (_:x:xs) = x : evens xs evens _ = [] findCodeMarkdown [] = [] --------------------------------------------------------------------- -- RENDER THE CODE showCode :: [Code] -> [String] showCode = concat . zipWithFrom f 1 . nubOrd where f i (Code x) | "#" `isPrefixOf` concat x = [] | all whitelist x = [] | otherwise = showStmt i $ filter (not . isBlank . dropComment) $ fixCmd $ map undefDots x fixCmd :: [String] -> [String] fixCmd xs | all ("cmd_ " `isPrefixOf`) xs = xs ++ ["pure () :: IO () "] | otherwise = map (replace "Stdout out" "Stdout (out :: String)" . replace "Stderr err" "Stderr (err :: String)") xs -- | Replace ... with undefined (don't use undefined with cmd; two ...'s should become one replacement) undefDots :: String -> String undefDots x | Just x <- stripSuffix "..." 
x, Just (x,_) <- stripInfix "..." x = x ++ new | otherwise = replace "..." new x where new = if words x `disjoint` ["cmd","cmd_","Development.Shake.cmd","Development.Shake.cmd_"] then "undefined" else "[\"\"]" showStmt :: Int -> [String] -> [String] showStmt _ [] = [] showStmt i xs | isDecl $ unlines xs = map f xs where f x = if fst (word1 x) `elem` dupes then "_" ++ show i ++ "_" ++ x else x showStmt i [x] | filter isAlpha (fst $ word1 x) `elem` types = ["type Code_" ++ show i ++ " = " ++ x] showStmt i [x] | length (words x) <= 2 = ["code_" ++ show i ++ " = (" ++ x ++ ")"] -- deal with operators and sections showStmt i xs | all isPredicate xs, length xs > 1 = zipWithFrom (\j x -> "code_" ++ show i ++ "_" ++ show j ++ " = " ++ x) 1 xs showStmt i xs = ("code_" ++ show i ++ " = do") : map (" " ++) xs ++ [" undefined" | isBindStmt $ last xs] isPredicate :: String -> Bool isPredicate x = not $ disjoint (words x) ["==","?=="] isBindStmt :: String -> Bool isBindStmt x = "let " `isPrefixOf` x || " <- " `isInfixOf` x isDecl :: String -> Bool isDecl x | fst (word1 x) `elem` ["import","infix","instance","newtype","data"] = True isDecl (words -> name:"::":_) | all isAlphaNum name = True -- foo :: Type Signature isDecl x | "=" `elem` takeWhile (`notElem` ["let","where"]) (words $ takeWhile (/= '{') x) = True -- foo arg1 arg2 = an implementation isDecl _ = False --------------------------------------------------------------------- -- TEXT MANIPULATION -- | Is a string empty or whitespace isBlank :: String -> Bool isBlank = all isSpace -- | If all lines are indented by at least n spaces, then trim n spaces from each line unindent :: [String] -> [String] unindent xs = map (drop n) xs where n = minimum $ 1000 : map (length . takeWhile (== ' ')) (filter (not . isBlank) xs) -- | Remove line comments from the end of lines dropComment :: String -> String dropComment = fst . 
breakOn "--" -- | Find all pieces of text inside a given tag insideTag :: String -> String -> [String] insideTag tag = map (fst . breakOn ("")) . drop1 . splitOn ("<" ++ tag ++ ">") -- | Given some HTML, find the raw text innerText :: String -> String innerText ('<':xs) = innerText $ drop1 $ dropWhile (/= '>') xs innerText ('&':xs) | Just xs <- stripPrefix "quot;" xs = '\"' : innerText xs | Just xs <- stripPrefix "lt;" xs = '<' : innerText xs | Just xs <- stripPrefix "gt;" xs = '>' : innerText xs | Just xs <- stripPrefix "amp;" xs = '&' : innerText xs innerText (x:xs) = x : innerText xs innerText [] = [] --------------------------------------------------------------------- -- DATA SECTION -- | Only the following identifiers can appear in italic code blocks in Haddock -- (otherwise it's a common markup mistake) italics :: [String] italics = words "command-name file-name N" -- | Identifiers that indicate the fragment is a type types :: [String] types = words $ "MVar IO String FilePath Maybe [String] FSATrace Char ExitCode ReaderT Change " ++ "Action Resource Rebuild FilePattern Development.Shake.FilePattern " ++ "Lint Verbosity Rules CmdOption Int Double " ++ "NFData Binary Hashable Eq Typeable Show Applicative " ++ "CmdResult ByteString ProcessHandle Rule Monad MonadFail Monoid Data TypeRep " ++ "BuiltinRun BuiltinLint BuiltinCheck ShakeDatabase" -- | Duplicated identifiers which require renaming dupes :: [String] dupes = words "main progressSimple rules" isFilePath :: String -> Bool isFilePath x = "C:\\" `isPrefixOf` x || (all validChar x && ("foo/" `isPrefixOf` x || takeExtension x `elem` exts)) where validChar x = isAlphaNum x || x `elem` "_./*" exts = words $ ".txt .hi .hs .o .exe .tar .cpp .cfg .dep .out .deps .m .h .c .html .zip " ++ ".js .json .trace .database .src .sh .bat .ninja .rot13 .version .digits .prof .md" isCmdFlag :: String -> Bool isCmdFlag "+RTS" = True isCmdFlag x = length a `elem` [1,2] && all (\x -> isAlphaNum x || x `elem` "-=/_[]") b where 
(a,b) = span (== '-') x isCmdFlags :: String -> Bool isCmdFlags = all (\x -> let y = dropSuffix "," x in isCmdFlag y || isArg y) . words where isArg = all (\x -> isUpper x || x == '=') isEnvVar :: String -> Bool isEnvVar x | Just x <- stripPrefix "$" x = all validChar x | Just x <- stripPrefix "%" x, Just x <- stripSuffix "%" x = all validChar x | otherwise = False where validChar x = isAlpha x || x == '_' isProgram :: String -> Bool isProgram (words -> x:xs) = x `elem` programs && all (\x -> isCmdFlag x || isFilePath x || all isAlpha x || x == "&&") xs where programs = words "excel gcc docker cl make ghc ghci cabal distcc npm build tar git fsatrace ninja touch pwd runhaskell rot13 main shake stack rm cat sed sh apt-get build-multiple" isProgram _ = False -- | Should a fragment be whitelisted and not checked whitelist :: String -> Bool whitelist x | null x || isFilePath x || isCmdFlags x || isEnvVar x || isProgram x = True whitelist x | elem x $ words $ "newtype do a q m c x value key os contents clean _make " ++ ".. /. 
// \\ //* dir/*/* dir [ " ++ "ConstraintKinds TemplateHaskell ApplicativeDo OverloadedLists OverloadedStrings GeneralizedNewtypeDeriving DeriveDataTypeable TypeFamilies SetConsoleTitle " ++ "Data.List System.Directory Development.Shake.FilePath run " ++ "NoProgress Error src about://tracing " ++ ".make/i586-linux-gcc/output build " ++ "/usr/special /usr/special/userbinary " ++ "Hidden extension xterm main opts result flagValues argValues fail " ++ "HEADERS_DIR /path/to/dir CFLAGS let linkFlags temp code out err " ++ "_shake _shake/build manual chrome://tracing/ compdb " ++ "docs/manual foo.* _build _build/run depfile 0.000s " ++ "@ndm_haskell file-name .PHONY filepath trim base stack extra #include " ++ "*> BuiltinRun BuiltinLint BuiltinIdentity RuleResult " ++ "oldStore mode node_modules llbuild Makefile " ++ "RebuildNever RLIMIT_NOFILE " = True whitelist x = x `elem` ["[Foo.hi, Foo.o]" ,"shake-progress" ,"type instance" ,"1m25s (15%)" ,"3m12s (82%)" ,"getPkgVersion $ GhcPkgVersion \"shake\"" ,"ghc --make MyBuildSystem -threaded -rtsopts \"-with-rtsopts=-I0 -qg\"" ,"# command-name (for file-name)" ,"build rules" ,"actions" ,"shakeFiles=\"_build\"" ,"#include \"" ,"pattern %> actions = (pattern ?==) ?> actions" -- because it overlaps ,"buildDir = \"_build\"" ,"#!/bin/sh" ,"shake-build-system" ,"\"_build\" x -<.> \"o\"" ,"[item1,item2,item2]" ,"$(LitE . StringL . 
loc_filename <$> location)" ,"-d[ FILE], --debug[=FILE]" ,"-r[ FILE], --report[=FILE], --profile[=FILE]" ,"man 2 getrlimit" ] blacklist :: [String] blacklist = -- from https://twitter.com/jesstelford/status/992756386160234497 ["obviously" ,"basically" ,"simply" ,"of course" ,"clearly" ,"everyone knows" -- ,"however" -- ,"so," -- ,"easy" ] shake-0.19.8/src/Test/Errors.hs0000644000000000000000000002456407346545000014426 0ustar0000000000000000{-# LANGUAGE TypeFamilies, GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} module Test.Errors(main) where import Development.Shake import Development.Shake.Classes import Development.Shake.FilePath import Test.Type import Data.List.Extra import Control.Monad import Control.Concurrent.Extra import General.GetOpt import General.Extra import Data.IORef import Control.Exception.Extra import System.Directory as IO import System.Time.Extra import qualified System.IO.Extra as IO data Args = Die deriving (Eq,Enum,Bounded,Show) newtype BadBinary = BadBinary String deriving (NFData,Show,Eq,Hashable,Typeable) type instance RuleResult BadBinary = BadBinary instance Binary BadBinary where put (BadBinary x) = put x get = do x <- get; if x == "bad" then error "get: BadBinary \"bad\"" else pure $ BadBinary x main = testBuildArgs test optionsEnum $ \args -> do "norule" %> \_ -> need ["norule_isavailable"] "failcreate" %> \_ -> pure () ["failcreates", "failcreates2"] &%> \_ -> writeFile' "failcreates" "" "recursive_" %> \_ -> need ["intermediate_"] "intermediate_" %> \_ -> need ["recursive_"] "rec1" %> \_ -> need ["rec2"] "rec2" %> \_ -> need ["rec1"] "systemcmd" %> \_ -> cmd "random_missing_command" "stack1" %> \_ -> need ["stack2"] "stack2" %> \_ -> need ["stack3"] "stack3" %> \_ -> error "crash" "staunch1" %> \out -> do liftIO $ sleep 0.1 writeFile' out "test" "staunch2" %> \_ -> error "crash" let catcher out op = out %> \out -> do writeFile' out "0" op $ do src <- IO.readFile' out; writeFile out $ show (read src + 1 :: 
Int) catcher "finally1" $ actionFinally $ fail "die" catcher "finally2" $ actionFinally $ pure () catcher "finally3" $ actionFinally $ liftIO $ sleep 10 catcher "finally4" $ actionFinally $ need ["wait"] "wait" ~> do liftIO $ sleep 10 catcher "exception1" $ actionOnException $ fail "die" catcher "exception2" $ actionOnException $ pure () "retry*" %> \out -> do ref <- liftIO $ newIORef 3 actionRetry (read [last out]) $ liftIO $ do old <- readIORef ref writeIORef ref $ old - 1 if old == 0 then writeFile' out "" else fail "die" res <- newResource "resource_name" 1 "resource" %> \_ -> withResource res 1 $ need ["resource-dep"] "overlap.txt" %> \out -> writeFile' out "overlap.txt" "overlap.t*" %> \out -> writeFile' out "overlap.t*" "overlap.*" %> \out -> writeFile' out "overlap.*" ["*.txx","*.tox"] &%> \_ -> fail "do not run" ["*p.txx"] &%> \_ -> fail "do not run" "chain.2" %> \out -> do src <- readFile' "chain.1" if src == "err" then error "err_chain" else writeFileChanged out src "chain.3" %> \out -> copyFile' "chain.2" out "tempfile" %> \out -> do file <- withTempFile $ \file -> do liftIO $ assertExists file pure file liftIO $ assertMissing file withTempFile $ \file -> do liftIO $ assertExists file writeFile' out file fail "tempfile-died" "tempdir" %> \out -> do file <- withTempDir $ \dir -> do let file = dir "foo.txt" liftIO $ writeFile (dir "foo.txt") "" -- will throw if the directory does not exist writeFile' out "" pure file liftIO $ assertMissing file phony "fail1" $ fail "die1" phony "fail2" $ fail "die2" when (Die `elem` args) $ action $ error "death error" "fresh_dir" %> \out -> liftIO $ createDirectoryRecursive out "need_dir" %> \out -> do liftIO $ createDirectoryRecursive "existing_dir" need ["existing_dir"] writeFile' out "" "persist_failure.1" %> \out -> do liftIO $ appendFile "persist_failure.log" "[pre]" need ["persist_failure.2"] liftIO $ appendFile "persist_failure.log" "[post]" writeFile' out "" "persist_failure.2" %> \out -> do src <- readFile' 
"persist_failure.3" liftIO $ print ("persist_failure.3", src) if src == "die" then do liftIO $ appendFile "persist_failure.log" "[err]" fail "die" else writeFileChanged out src "fast_failure" %> \_ -> do liftIO $ sleep 0.1 fail "die" "slow_success" %> \out -> do liftIO $ sleep 20 writeFile' out "" addOracle $ \(BadBinary x) -> pure $ BadBinary $ 'b':x "badinput" %> \out -> do askOracle $ BadBinary "bad" liftIO $ appendFile out "x" "badoutput" %> \out -> do askOracle $ BadBinary "ad" liftIO $ appendFile out "x" "badnone" %> \out -> do alwaysRerun liftIO $ appendFile out "x" "produces1" %> \out -> do produces [out <.> "also"] writeFile' (out <.> "also") "" writeFile' out "" "produces2" %> \out -> do produces [out <.> "also"] writeFile' out "" "finalfinal" %> \out -> do writeFile' out "" lock <- liftIO newLock let output = withLock lock . appendFile out liftIO (sleep 100) `actionFinally` (output "X" >> sleep 0.1) `actionFinally` output "Y" let catching out = flip actionCatch $ \(e :: SomeException) -> writeFile' out $ show e "catch1" %> \out -> catching out $ fail "magic1" "catch2" %> \out -> catching out $ liftIO $ killThread =<< myThreadId "catch3.1" %> \out -> fail "magic3" "catch3.2" %> \out -> catching out $ need ["catch3.1"] -- not tested by default since only causes an error when idle GC is turned on phony "block" $ liftIO $ putStrLn $ let x = x in x test build = do -- on Windows, file paths may end up with \ separators, make sure we can still match them let crash args parts = assertExceptionAfter (replace "\\" "/") parts (build $ "--quiet" : args) build ["clean"] writeFile "chain.1" "x" build ["chain.3","--sleep"] writeFile "chain.1" "err" crash ["chain.3"] ["err_chain"] crash ["norule"] ["norule_isavailable"] crash ["failcreate"] ["failcreate"] crash ["failcreates"] ["failcreates"] crash ["recursive_"] ["recursive_","intermediate_","recursive"] crash ["rec1","rec2"] ["rec1","rec2","indirect recursion","recursive"] notMacCI $ crash ["systemcmd"] 
["systemcmd","random_missing_command", "at cmd, called at"] crash ["stack1"] ["stack1","stack2","stack3","crash"] b <- IO.doesFileExist "staunch1" when b $ removeFile "staunch1" crash ["staunch1","staunch2","-j2"] ["crash"] assertBoolIO (not <$> IO.doesFileExist "staunch1") "File should not exist, should have crashed first" crash ["staunch1","staunch2","-j2","--keep-going","--silent"] ["crash"] assertBoolIO (IO.doesFileExist "staunch1") "File should exist, staunch should have let it be created" crash ["finally1"] ["die"] assertContents "finally1" "1" build ["finally2"] assertContents "finally2" "1" crash ["exception1"] ["die"] assertContents "exception1" "1" build ["exception2"] assertContents "exception2" "0" crash ["retry0"] ["positive","0"] crash ["retry1"] ["die"] build ["retry4"] forM_ ["finally3","finally4"] $ \name -> do t <- forkIO $ ignore $ build [name,"--exception"] retry 10 $ sleep 0.1 >> assertContents name "0" throwTo t (IndexOutOfBounds "test") retry 10 $ sleep 0.1 >> assertContents name "1" crash ["resource"] ["cannot currently introduce a dependency","withResource","resource_name"] build ["overlap.foo"] assertContents "overlap.foo" "overlap.*" build ["overlap.txt"] assertContents "overlap.txt" "overlap.txt" crash ["overlap.txx"] $ ["key matches multiple rules","matched: 4","overlap.txx","overlap.t*","overlap.*","*.tox"] ++ ["Test/Errors.hs"] crash ["tempfile"] ["tempfile-died"] src <- readFile "tempfile" assertMissing src build ["tempdir"] crash ["--die"] ["Shake","death error","Test/Errors.hs"] putStrLn "## BUILD errors" (out,_) <- IO.captureOutput $ build [] assertBool ("nothing to do" `isInfixOf` out) $ "Expected 'nothing to do', but got: " ++ out putStrLn "## BUILD errors fail1 fail2 -k -j2" (out,_) <- IO.captureOutput $ try_ $ build ["fail1","fail2","-k","-j2",""] assertBool ("die1" `isInfixOf` out && "die2" `isInfixOf` out) $ "Expected 'die1' and 'die2', but got: " ++ out crash ["fresh_dir"] ["expected a file, got a directory","fresh_dir"] 
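The `catcher`/`catching` rules being driven above exercise Shake's cleanup combinators: `actionFinally` runs an `IO` finaliser whether the rule body succeeds or throws, while `actionCatch` recovers from an exception inside an `Action`. A small sketch of both, where the rule targets and the `cleanup.log` name are hypothetical:

```haskell
{-# LANGUAGE ScopedTypeVariables #-}
import Control.Exception (SomeException)
import Development.Shake

rules :: Rules ()
rules = do
    -- finaliser runs even if the body fails (cf. the "finally*" rules above)
    "finally.txt" %> \out ->
        writeFile' out "ok" `actionFinally` appendFile "cleanup.log" "done\n"
    -- recover from an exception raised in a rule (cf. the "catch*" rules above)
    "catch.txt" %> \out ->
        fail "boom" `actionCatch` \(e :: SomeException) -> writeFile' out (show e)
```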
    crash ["need_dir"] ["expected a file, got a directory","existing_dir"]

    -- check errors don't persist to the database, #428
    writeFile "persist_failure.log" ""
    writeFile "persist_failure.3" "test"
    build ["persist_failure.1","--sleep"]
    writeFile "persist_failure.3" "die"
    crash ["persist_failure.1","--sleep"] []
    assertContents "persist_failure.log" "[pre][post][err][pre]"
    writeFile "persist_failure.3" "test"
    build ["persist_failure.1","--sleep"]
    assertContents "persist_failure.log" "[pre][post][err][pre]"
    writeFile "persist_failure.3" "more"
    build ["persist_failure.1"]
    assertContents "persist_failure.log" "[pre][post][err][pre][pre][post]"

    -- check a fast failure aborts a slow success
    (t, _) <- duration $ crash ["fast_failure","slow_success","-j2"] ["die"]
    assertBool (t < 10) $ "Took too long, expected < 10, got " ++ show t

    -- for exceptions on Key we die while reading the database, and restart from scratch
    build ["badinput"]
    build ["badinput","--silent"]
    assertContents "badinput" "xx"
    build ["badnone","--silent"] -- must be able to still run other rules
    assertContents "badnone" "x"

    -- for exceptions on Value we die while running the rule that requires it
    build ["badoutput"]
    crash ["badoutput"] ["badoutput","BadBinary"]
    build ["badnone"] -- must be able to still run other rules
    assertContents "badnone" "xx"

    -- check that produces works
    build ["produces1"]
    crash ["produces2"] ["produces","produces2.also"]

    -- check finally doesn't run twice, see https://github.com/ndmitchell/shake/issues/611
    t <- forkIO $ build ["finalfinal","--quiet"]
    sleep 0.2
    killThread t
    sleep 0.5
    assertContents "finalfinal" "XY"

    build ["catch1"]
    assertContentsInfix "catch1" "magic1"
    crash ["catch2"] [show ThreadKilled]
    crash ["catch3.2"] ["magic3"]

shake-0.19.8/src/Test/Existence.hs

module Test.Existence(main) where

import Development.Shake
import Development.Shake.Internal.FileInfo
import Development.Shake.Internal.FileName
import System.Directory
import System.FilePath
import General.Extra

main :: IO () -> IO ()
main _ = do
    cwd <- getCurrentDirectory
    someFiles <- getDirectoryFilesIO cwd ["*"]
    let someFile = headErr someFiles
    assertIsJust $ getFileInfo False $ fileNameFromString someFile
    let fileThatCantExist = someFile </> "fileThatCantExist"
    assertIsNothing $ getFileInfo False $ fileNameFromString fileThatCantExist

assertIsJust :: IO (Maybe a) -> IO ()
assertIsJust action = do
    Just _ <- action
    pure ()

assertIsNothing :: IO (Maybe a) -> IO ()
assertIsNothing action = do
    Nothing <- action
    pure ()

shake-0.19.8/src/Test/FileLock.hs

module Test.FileLock(main) where

import Development.Shake
import Control.Concurrent.Extra
import Control.Exception.Extra
import Control.Monad
import Data.Either.Extra
import System.Time.Extra
import System.Info.Extra
import Test.Type

main = testBuild test $ action $ do
    putInfo "Starting sleep"
    liftIO $ sleep 5
    putInfo "Finished sleep"

-- Disabled under Mac because it fails, see #560
-- Reported as working locally under APFS, so may just be the older HFS+ as used by Travis CI
test build = unless isMac $ do
    -- check it fails exactly once
    time <- offsetTime
    lock <- newLock
    let out msg = do t <- time; withLock lock $ print (t, msg)
    out "before onceFork"
    a <- onceFork $ do out "a1"; build ["-VVV"]; out "a2"
    b <- onceFork $ do out "b1"; build ["-VVV"]; out "b2"
    out "after onceFork"
    a <- try_ a
    out "after try a"
    b <- try_ b
    out "after try b"
    when (length (filter isLeft [a,b]) /= 1) $
        fail $ "Expected one success and one failure, got " ++ show [a,b]
    -- check it succeeds after the lock has been held
    build []

shake-0.19.8/src/Test/FilePath.hs

module Test.FilePath(main) where

import Development.Shake.FilePath
import Development.Shake
import qualified System.FilePath as Native
import Test.Type
import Test.QuickCheck
import
Control.Monad
import Data.List.Extra
import qualified Data.ByteString.Char8 as BS
import qualified Development.Shake.Internal.FileName as BS
import System.Info.Extra
import System.Directory
import qualified System.IO.Extra as IO
import General.Extra

newtype File = File String deriving Show

instance Arbitrary File where
    arbitrary = fmap File $ listOf $ oneof $ map pure "a /\\:."
    shrink (File x) = map File $ shrink x

main = testSimple $ do
    let infix 1 ===
        a === b = a Test.Type.=== b -- duplicate definition in QuickCheck 2.7 and above
    let norm x =
            let s = toStandard $ normaliseEx x
                b = BS.unpack (BS.filepathNormalise $ BS.pack x)
            in if s == b then s else error $ show ("Normalise functions differ",x,s,b)

    -- basic examples
    norm "" === "."
    norm "." === "."
    norm "/" === "/"
    norm "./" === "./"
    norm "/." === "/."
    norm "/./" === "/"
    norm "a/." === "a"
    norm "./a" === "a"
    norm "./a/." === "a"
    norm "./a/./" === "a/"
    norm "a/.." === "."
    norm "a/./.." === "."
    norm "a/../" === "./"
    norm "/a/../" === "/"
    norm "/a/./../" === "/"
    norm "../a" === "../a"
    norm "/../a/" === "/../a/"

    -- more realistic examples
    norm "neil//./test/moo/../bar/bob/../foo" === "neil/test/bar/foo"
    norm "bar/foo" === "bar/foo"
    norm "bar/foo/" === "bar/foo/"
    norm "../../foo" === "../../foo"
    norm "foo/../..///" === "../"
    norm "foo/bar/../../neil" === "neil"
    norm "foo/../bar/../neil" === "neil"
    norm "/foo/bar" === "/foo/bar"
    norm "//./" === (if isWindows then "//" else "/")
    norm "//foo/./bar" === (if isWindows then "//foo/bar" else "/foo/bar")
    when isWindows $ norm "c:\\foo\\bar" === "c:/foo/bar"
    when isWindows $ normaliseEx "foo/bar\\baz" === "foo\\bar\\baz"

    Success{} <- quickCheckWithResult stdArgs{maxSuccess=1000} $ \(File x) ->
        let y = norm x
            sep = Native.isPathSeparator
            noDrive = if isWindows then drop1 else id
            ps = [y /= ""
                 ,null x || (sep (headErr x) == sep (headErr y) && sep (last x) == sep (last y))
                 ,not $ "/./" `isInfixOf` y
                 ,not isWindows || '\\' `notElem` y
                 ,not $ "//" `isInfixOf` noDrive y
                 ,".." `notElem` dropWhile (== "..") (dropWhile (\x -> all isPathSeparator x || isDrive x) $ splitDirectories y)
                 ,norm y == y]
        in if and ps then True else error (show (x, y, ps))

    dropDirectory1 "aaa/bbb" === "bbb"
    dropDirectory1 "aaa/" === ""
    dropDirectory1 "aaa" === ""
    dropDirectory1 "" === ""
    takeDirectory1 "aaa/bbb" === "aaa"
    takeDirectory1 "aaa/" === "aaa"
    takeDirectory1 "aaa" === "aaa"
    replaceDirectory1 "root/file.ext" "directory" === "directory" ++ [pathSeparator] ++ "file.ext"
    replaceDirectory1 "root/foo/bar/file.ext" "directory" === "directory" ++ [pathSeparator] ++ "foo/bar/file.ext"
    searchPathSeparator === Native.searchPathSeparator
    pathSeparators === Native.pathSeparators

    if isWindows then
        toNative "//this is a///test\\here/" === "\\\\this is a\\\\\\test\\here\\"
     else
        toNative "//this is a///test\\here/" === "//this is a///test\\here/"

    -- check common manipulations work
    ("//*" <.> "foo") === "//*.foo"
    toStandard ("a" </> "b" </> "c" <//> "*" <.> "exe") === "a/b/c//*.exe"
    ("a" <//> "/b/c") === "a//b/c"

    -- check makeRelativeEx
    IO.withTempDir $ \cwd -> do
        setCurrentDirectory cwd
        createDirectory (cwd </> "a")
        createDirectory (cwd </> "b")
        writeFile (toNative (cwd </> "b/file.out")) "foo"
        createDirectory (cwd </> "e")
        createDirectory (cwd </> "e/f")
        createDirectory (cwd </> "e/f/g")

        -- on Windows creating symlinks requires additional privileges
        -- so skip the test there
        -- portable symlinks only available on GHC 8.4
        -- unless isWindows $
        --     createFileLink (cwd </> "e/f/g/") (cwd </> "c")

        let f a b c = makeRelativeEx a (normalise b) >>= (=== fmap normalise c)
        f "/x/y/" "/x/y/z" $ Just "z"
        f (cwd </> "c") "../b/file.out" $ Just "../b/file.out"
        f "a" "b/file.out" $ Just "b/file.out"
        f (cwd </> "a") (cwd </> "b/file.out") $ Just "../b/file.out"
        -- unless isWindows $ do
        --     f (cwd </> "c") (cwd </> "b/file.out") $ Just "../../../b/file.out"
        --     f "c" (cwd </> "b/file.out") $ Just "../../../b/file.out"
        --     f (cwd </> "c/../../../a") (cwd </> "b/file.out") $ Just "../b/file.out"
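The assertions above pin down the behaviour of Shake's path helpers, which extend `System.FilePath` with `.`/`..` collapsing. A small usage sketch of the same functions, with the expected results taken directly from the tests above:

```haskell
import Development.Shake.FilePath

-- normaliseEx collapses "." and ".." segments, unlike System.FilePath.normalise
ex1 = toStandard $ normaliseEx "foo/bar/../../neil"  -- "neil", as asserted above
ex2 = dropDirectory1 "aaa/bbb"                       -- "bbb": drops the first component
ex3 = "dir" </> "file" <.> "o"                       -- "dir/file.o" on POSIX separators
```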
shake-0.19.8/src/Test/FilePattern.hs0000644000000000000000000002015307346545000015355 0ustar0000000000000000 module Test.FilePattern(main) where import Development.Shake.Internal.FilePattern import Development.Shake.FilePath import Control.Monad import System.IO.Unsafe import System.Info.Extra import Data.List.Extra import Test.Type import Test.QuickCheck hiding ((===)) newtype Pattern = Pattern FilePattern deriving (Show,Eq) newtype Path = Path FilePath deriving (Show,Eq) -- Since / and * are the only "interesting" elements, just add ab to round out the set instance Arbitrary Pattern where arbitrary = fmap Pattern $ listOf $ elements "\\/*ab" shrink (Pattern x) = map Pattern $ shrinkList (\x -> ['/' | x == '\\']) x instance Arbitrary Path where arbitrary = fmap Path $ listOf $ elements "\\/ab" shrink (Path x) = map Path $ shrinkList (\x -> ['/' | x == '\\']) x main = testSimple $ do internalTest let norm = filter (/= ".") . split isPathSeparator let f b pat file = do assertBool (b == (pat ?== file)) $ show pat ++ " ?== " ++ show file ++ "\nEXPECTED: " ++ show b assertBool (b == (pat `walker` file)) $ show pat ++ " `walker` " ++ show file ++ "\nEXPECTED: " ++ show b when b $ assertBool (norm (substitute (extract pat file) pat) == norm file) $ "FAILED substitute/extract property\nPattern: " ++ show pat ++ "\nFile: " ++ show file ++ "\n" ++ "Extracted: " ++ show (extract pat file) ++ "\nSubstitute: " ++ show (substitute (extract pat file) pat) f True "//*.c" "foo/bar/baz.c" f True "**/*.c" "foo/bar/baz.c" f True (toNative "//*.c") "foo/bar\\baz.c" f True (toNative "**/*.c") "foo/bar\\baz.c" f True "*.c" "baz.c" f True "//*.c" "baz.c" f True "**/*.c" "baz.c" f True "test.c" "test.c" f False "*.c" "foor/bar.c" f False "*/*.c" "foo/bar/baz.c" f False "foo//bar" "foobar" f False "foo/**/bar" "foobar" f False "foo//bar" "foobar/bar" f False "foo/**/bar" "foobar/bar" f False "foo//bar" "foo/foobar" f False "foo/**/bar" "foo/foobar" f True "foo//bar" "foo/bar" f True 
"foo/**/bar" "foo/bar" f True "foo/bar" (toNative "foo/bar") f True (toNative "foo/bar") "foo/bar" f True (toNative "foo/bar") (toNative "foo/bar") f True "//*" "/bar" f False "**/*" "/bar" f True "/bob//foo" "/bob/this/test/foo" f True "/bob/**/foo" "/bob/this/test/foo" f False "/bob//foo" "bob/this/test/foo" f False "/bob/**/foo" "bob/this/test/foo" f True "bob//foo/" "bob/this/test/foo/" f True "bob/**/foo/" "bob/this/test/foo/" f False "bob//foo/" "bob/this/test/foo" f False "bob/**/foo/" "bob/this/test/foo" f True "a//" "a" f True "a/**" "a" f True "/a//" "/a" f True "/a/**" "/a" f True "///a//" "/a" f False "**/a/**" "/a" f False "///" "" f True "///" "/" f True "/**" "/" f True "///" "a/" f True "**/" "a/" f True "////" "" f True "**/**" "" f True "x///y" "x/y" f True "x/**/y" "x/y" f True "x///" "x/" f True "x/**/" "x/" f True "x///" "x/foo/" f True "x/**/" "x/foo/" f False "x///" "x" f False "x/**/" "x" f True "x///" "x/foo/bar/" f True "x/**/" "x/foo/bar/" f False "x///" "x/foo/bar" f False "x/**/" "x/foo/bar" f True "x///y" "x/z/y" f True "x/**/*/y" "x/z/y" f True "" "" f False "" "y" f False "" "/" f True "*/*" "x/y" f False "*/*" "x" f True "//*" "x" f True "**/*" "x" f True "//*" "" f True "**/*" "" f True "*//" "x" f True "*/**" "x" f True "*//" "" f True "*/**" "" f True "*//*" "x/y" f True "*/**/*" "x/y" f False "*//*" "" f False "*/**/*" "" f False "*//*" "x" f False "*/**/*" "x" f False "*//*//*" "x/y" f False "*/**/*/**/*" "x/y" f True "//*/" "/" f False "**/*/" "/" f True "*/////" "/" f True "*/**/**/" "/" f False "b*b*b*//" "bb" f False "b*b*b*/**" "bb" f False "**" "/" f False "**/x" "/x" f True "**" "x/" f (not isWindows) "**" "\\\\drive" f (not isWindows) "**" "C:\\drive" f (not isWindows) "**" "C:drive" -- We support ignoring '.' 
values in FilePath as they are inserted by @filepath@ a lot f True "./file" "file" f True ("" </> "file") "file" f True "foo/./bar" "foo/bar" f True "foo/./bar" "foo/./bar" f False "foo/./bar" "foo/bob" filePattern "**/*.c" "test.txt" === Nothing filePattern "**/*.c" "foo.c" === Just ["","foo"] filePattern "**/*.c" "bar/baz/foo.c" === Just ["bar/baz/","foo"] filePattern "**/*.c" "bar\\baz\\foo.c" === Just (if isWindows then ["bar/baz/","foo"] else ["","bar\\baz\\foo"]) simple "a*b" === False simple "a//b" === False simple "a/**/b" === False simple "/a/b/cccc_" === True simple "a///b" === False simple "a/**/b" === False assertBool (compatible []) "compatible" assertBool (compatible ["//*a.txt","foo//a*.txt"]) "compatible" assertBool (compatible ["**/*a.txt","foo/**/a*.txt"]) "compatible" assertBool (compatible ["//*a.txt","foo/**/a*.txt"]) "compatible" assertBool (not $ compatible ["//*a.txt","foo//a*.*txt"]) "compatible" assertBool (not $ compatible ["**/*a.txt","foo/**/a*.*txt"]) "compatible" extract "//*a.txt" "foo/bar/testa.txt" === ["foo/bar/","test"] extract "**/*a.txt" "foo/bar/testa.txt" === ["foo/bar/","test"] extract "//*a.txt" "testa.txt" === ["","test"] extract "**/*a.txt" "testa.txt" === ["","test"] extract "//a.txt" "a.txt" === [""] extract "**/a.txt" "a.txt" === [""] extract "//a.txt" "/a.txt" === ["/"] extract "a//b" "a/b" === [""] extract "a/**/b" "a/b" === [""] extract "a//b" "a/x/b" === ["x/"] extract "a/**/b" "a/x/b" === ["x/"] extract "a//b" "a/x/y/b" === ["x/y/"] extract "a/**/b" "a/x/y/b" === ["x/y/"] extract "a///b" "a/x/y/b" === ["x/y/"] extract "a/**/**/b" "a/x/y/b" === ["","x/y/"] extract "//*a*.txt" "testada.txt" === ["","test","da"] extract "**/*a*.txt" "testada.txt" === ["","test","da"] extract (toNative "//*a*.txt") "testada.txt" === ["","test","da"] extract (toNative "**/*a*.txt") "testada.txt" === ["","test","da"] substitute ["","test","da"] "//*a*.txt" === "testada.txt" substitute ["","test","da"] "**/*a*.txt" === "testada.txt"
substitute ["foo/bar/","test"] "//*a.txt" === "foo/bar/testa.txt" substitute ["foo/bar/","test"] "**/*a.txt" === "foo/bar/testa.txt" (False, Walk _) <- pure $ walk ["*.xml"] (False, Walk _) <- pure $ walk ["//*.xml"] (False, Walk _) <- pure $ walk ["**/*.xml"] (False, WalkTo ([], [("foo",Walk _)])) <- pure $ walk ["foo//*.xml"] (False, WalkTo ([], [("foo",Walk _)])) <- pure $ walk ["foo/**/*.xml"] (False, WalkTo ([], [("foo",WalkTo ([],[("bar",Walk _)]))])) <- pure $ walk ["foo/bar/*.xml"] (False, WalkTo (["a"],[("b",WalkTo (["c"],[]))])) <- pure $ walk ["a","b/c"] ([], [("foo",WalkTo ([],[("bar",Walk _)]))]) <- let (False, Walk f) = walk ["*/bar/*.xml"] in pure $ f ["foo"] (False, WalkTo ([],[("bar",Walk _),("baz",Walk _)])) <- pure $ walk ["bar/*.xml","baz//*.c"] (False, WalkTo ([],[("bar",Walk _),("baz",Walk _)])) <- pure $ walk ["bar/*.xml","baz/**/*.c"] (False, WalkTo ([], [])) <- pure $ walk [] (True, Walk _) <- pure $ walk ["//"] (True, Walk _) <- pure $ walk ["**"] (True, WalkTo _) <- pure $ walk [""] Success{} <- quickCheckWithResult stdArgs{maxSuccess=1000} $ \(Pattern p) (Path x) -> let label _ = property in -- Ignore label to workaround QuickCheck space-leak -- See #450 and https://github.com/nick8325/quickcheck/pull/93 let b = p ?== x in (if b then property else label "No match") $ unsafePerformIO $ do f b p x; pure True pure () walker :: FilePattern -> FilePath -> Bool -- Slight difference of opinion since Walker is always relative to something walker a b | isRelativePattern a, not $ isRelativePath b = False walker a b = f (split isPathSeparator b) $ snd $ walk [a] where f (".":xs) w = f xs w f (x:xs) (Walk op) = f (x:xs) $ WalkTo $ op [x] f [x] (WalkTo (file, _ )) = x `elem` file f (x:xs) (WalkTo (_ , dir)) | Just w <- lookup x dir = f xs w f _ _ = False shake-0.19.8/src/Test/Files.hs0000644000000000000000000000327407346545000014207 0ustar0000000000000000 module Test.Files(main) where import Development.Shake import Development.Shake.FilePath import 
System.Directory import Test.Type import Control.Monad import Data.List main = testBuild test $ do want ["even.txt","odd.txt"] "A1-plus-B" %> \out -> do a1 <- readFileLines "A1" b <- readFileLines "B" writeFileLines out $ a1 ++ b ["A1", "A2"] &%> \[o1, o2] -> do writeFileLines o1 ["This is", "A1"] writeFileLines o2 ["This is", "A2"] "B" %> \out -> writeFileLines out ["This is", "B"] ["even.txt","odd.txt"] &?%> \[evens,odds] -> do src <- readFileLines "numbers.txt" let (es,os) = partition even $ map read src writeFileLines evens $ map show es writeFileLines odds $ map show os ["dir1/out.txt","dir2/out.txt"] &?%> \[a,b] -> do writeFile' a "a" writeFile' b "b" ["or1.txt","or2.txt","or*.txt"] |%> \x -> writeFile' x x (\x -> let dir = takeDirectory x in if takeFileName dir /= "pred" then Nothing else Just [dir </> "a.txt",dir </> "b.txt"]) &?> \outs -> mapM_ (`writeFile'` "") outs test build = do forM_ [[],["--usepredicate"]] $ \args -> do let nums = unlines . map show writeFile "numbers.txt" $ nums [1,2,4,5,2,3,1] build ("--sleep":args) assertContents "even.txt" $ nums [2,4,2] assertContents "odd.txt" $ nums [1,5,3,1] build ["clean"] build ["--no-build","--report=-"] build ["dir1/out.txt"] build ["pred/a.txt"] -- Test #496 build ["A1-plus-B"] removeFile "A2" build ["A1-plus-B"] build ["or2.txt","or4.txt"] assertContents "or4.txt" "or4.txt" shake-0.19.8/src/Test/Forward.hs module Test.Forward(main) where import Data.Char import Data.List.Extra import Development.Shake import System.Info.Extra import Development.Shake.Forward import Development.Shake.FilePath import Test.Type import System.IO.Extra as IO main = testBuild test $ forwardRule $ do cs <- getDirectoryFiles "" ["*.c"] os <- forP cs $ \c -> do let o = c <.> "o" cache $ cmd "gcc -c" [c] "-o" [o] pure o cache $ cmd "gcc -o" ["Main" <.> exe] os cache $ cmd ["." </>
"Main" <.> exe] (FileStdout "output.txt") -- Doing this way to test cacheAction with arguments -- any real code should use a tracked readFile and avoid passing arguments to the closure src <- liftIO $ IO.readFile' "output.txt" cacheActionWith "reducer" src $ writeFile' "out.txt" $ filter isUpper src checkVaild act = do b <- hasTracker if not b then putStrLn "Warning: Not running forward test (no tracker)" else if isMac then putStrLn "Warning: Not running forward test (doesn't work on Mac)" else act test build = checkVaild $ do -- first clean then copy the source files over build ["clean"] copyDirectoryChanged (shakeRoot "src/Test/C") "." -- build and rebuild build ["--forward"] assertContents "output.txt" "Hello Shake Users!\n" assertContents "out.txt" "HSU" -- check that cacheAction doesn't rerun when it shouldn't writeFile "out.txt" "HHH" build ["-j2","--forward"] assertContents "output.txt" "Hello Shake Users!\n" assertContents "out.txt" "HHH" -- modify the constants orig <- IO.readFile' "constants.c" writeFile "constants.c" $ replace "Shake" "Rattle" orig build ["-j2","--forward"] assertContents "output.txt" "Hello Rattle Users!\n" assertContents "out.txt" "HRU" -- put it back writeFile "constants.c" orig build ["-j2","--forward"] assertContents "output.txt" "Hello Shake Users!\n" assertContents "out.txt" "HSU" shake-0.19.8/src/Test/History.hs0000644000000000000000000000414007346545000014577 0ustar0000000000000000{-# LANGUAGE TypeFamilies #-} {-# OPTIONS_GHC -Wno-orphans #-} module Test.History(main) where import Control.Monad.Extra import Development.Shake import Test.Type import General.Extra import General.GetOpt import System.Directory import Development.Shake.Internal.History.Symlink data Args = Die deriving (Eq,Enum,Bounded,Show) type instance RuleResult FilePath = String main = testBuildArgs test optionsEnum $ \args -> do let die :: a -> a die x = if Die `elem` args then error "Die" else x phony "Phony" $ pure () "Phony.txt" %> \out -> do need ["Phony"] 
copyFile' "In.txt" out "OutFile.txt" %> \out -> die $ copyFile' "In.txt" out "OutFile.link.txt" %> \out -> die $ do need ["In.txt"] whenJustM (liftIO $ createLinkMaybe "In.txt" out) $ \_ -> -- just is equivalent to an error happening, so just copy the file writeFile' out =<< readFile' "In.txt" reader <- addOracleCache $ \x -> die (readFile' x) "OutOracle.txt" %> \out -> do historyDisable writeFile' out =<< reader "In.txt" ["OutFiles1.txt","OutFiles2.txt"] &%> \[out1, out2] -> die $ do copyFile' "In.txt" out1 copyFile' "In.txt" out2 test build = forM_ [[],["--share-copy"]] $ \args -> do let setIn = writeFile "In.txt" let outs = ["OutFile.txt","OutOracle.txt","OutFiles1.txt","OutFiles2.txt","Phony.txt"] -- ,"OutFile.link.txt"] let checkOut x = mapM_ (`assertContents` x) outs build ["clean"] setIn "1" build $ args ++ ["--share","--sleep"] ++ outs checkOut "1" setIn "2" build $ args ++ ["--share","--sleep"] ++ outs checkOut "2" setIn "1" assertException [] $ build ["OutFile.txt","--die","--quiet","--sleep"] build $ args ++ ["--die","--share"] ++ outs checkOut "1" setIn "2" mapM_ removeFile_ outs build $ args ++ ["--die","--share"] ++ outs checkOut "2" setIn "2" removeFile ".shake.database" build $ args ++ ["--die","--share"] ++ outs checkOut "2" shake-0.19.8/src/Test/Journal.hs0000644000000000000000000000161707346545000014556 0ustar0000000000000000 module Test.Journal(main) where import Control.Monad import Data.IORef import Development.Shake import Development.Shake.FilePath import Test.Type import System.IO.Unsafe {-# NOINLINE rebuilt #-} rebuilt :: IORef Int rebuilt = unsafePerformIO $ newIORef 0 main = testBuild test $ do want ["a.out","b.out","c.out"] "*.out" %> \out -> do liftIO $ atomicModifyIORef rebuilt $ \a -> (a+1,()) copyFile' (out -<.> "in") out test build = do let change x = writeFile (x <.> "in") x let count x = do before <- readIORef rebuilt build ["--sleep"] after <- readIORef rebuilt x === after - before change "a" change "b" change "c" count 3 -- 
test that compressing the database doesn't corrupt anything replicateM_ 4 $ do change "a" count 1 change "a" change "c" count 2 shake-0.19.8/src/Test/Lint.hs0000644000000000000000000001207707346545000014054 0ustar0000000000000000{-# LANGUAGE TypeFamilies, GeneralizedNewtypeDeriving, DeriveDataTypeable #-} module Test.Lint(main) where import Development.Shake import Development.Shake.Classes import Development.Shake.FilePath import General.Extra import Test.Type import Control.Exception import System.Directory as IO import System.Info.Extra import Control.Monad.Extra newtype Zero = Zero () deriving (Eq, Show, NFData, Typeable, Hashable, Binary) type instance RuleResult Zero = Zero main = testBuild test $ do addOracle $ \Zero{} -> do liftIO $ createDirectoryRecursive "dir" liftIO $ setCurrentDirectory "dir" pure $ Zero () "changedir" %> \out -> do Zero () <- askOracle $ Zero () writeFile' out "" "pause.*" %> \out -> do liftIO $ sleep 0.2 need ["cdir" <.> takeExtension out] writeFile' out "" "cdir.*" %> \out -> do pwd <- liftIO getCurrentDirectory let dir2 = "dir" ++ takeExtension out liftIO $ createDirectoryRecursive dir2 liftIO $ setCurrentDirectory dir2 liftIO $ sleep 0.4 liftIO $ setCurrentDirectory pwd writeFile' out "" "createonce" %> \out -> writeFile' out "X" "createtwice" %> \out -> do need ["createonce"] liftIO sleepFileTime writeFile' "createonce" "Y" writeFile' out "" "recordtwice" %> \out -> do alwaysRerun trackWrite ["recordtwice_"] trackWrite ["recordtwice_"] writeFile' "recordtwice_" "" writeFile' out "" "listing" %> \out -> do writeFile' (out <.> "ls1") "" getDirectoryFiles "" ["//*.ls*"] writeFile' (out <.> "ls2") "" writeFile' out "" "existance" %> \out -> do Development.Shake.doesFileExist "exists" writeFile' "exists" "" writeFile' out "" "gen*" %> \out -> writeFile' out out "needed1" %> \out -> do needed ["gen1"] writeFile' out "" "needed2" %> \out -> do orderOnly ["gen2"] needed ["gen2"] writeFile' out "" "tracker-write1" %> \out -> do gen "x" $ 
out <.> "txt" need [out <.> "txt"] writeFile' out "" "tracker-write2" %> \out -> do gen "x" $ out <.> "txt" writeFile' out "" "tracker-source2" %> \out -> copyFile' "tracker-source1" out "tracker-read1" %> \out -> do access "tracker-source1" writeFile' out "" "tracker-read2" %> \out -> do access "tracker-source1" need ["tracker-source1"] writeFile' out "" "tracker-read3" %> \out -> do access "tracker-source2" need ["tracker-source2"] writeFile' out "" "tracker-compile.o" %> \out -> do need ["tracker-source.c", "tracker-source.h"] cmd "gcc" ["-c", "tracker-source.c", "-o", out] "tracker-compile-auto.o" %> \out -> do need ["tracker-source.c"] cmd AutoDeps "gcc" ["-c", "tracker-source.c", "-o", out] "../lint2/tracker-relative" %> \out -> writeFile' out "tracker-relative" "tracker-relative1" %> \out -> do need ["../lint2/tracker-relative"] access "../lint2/tracker-relative" writeFile' out "tracker-relative" "tracker-relative2" %> \out -> do access "../lint2/tracker-relative" writeFile' out "tracker-relative" where gen t f = cmd Shell "echo" t ">" (toNative f) :: Action () access f = if isWindows then cmd_ Shell "type" (toNative f) "> nul" else cmd_ Shell "cat" f "> /dev/null" test build = do dir <- getCurrentDirectory let crash args parts = assertException parts (build $ "--quiet" : args) `finally` setCurrentDirectory dir crash ["changedir"] ["current directory has changed"] build ["cdir.1","cdir.2","-j1"] build ["--clean","cdir.1","pause.2","-j1"] crash ["--clean","cdir.1","pause.2","-j2"] ["output","lint","current directory has changed"] crash ["existance"] ["changed since being depended upon"] crash ["createtwice"] ["changed since being depended upon"] build ["recordtwice"] crash ["listing"] ["changed since being depended upon","listing.ls2"] crash ["--clean","listing","existance"] ["changed since being depended upon"] crash ["needed1"] ["'needed' file required rebuilding"] build ["needed2"] whenM hasTracker $ do writeFile "tracker-source1" "" writeFile 
"tracker-source2" "" writeFile "tracker-source.c" "#include \n#include \"tracker-source.h\"\n" writeFile "tracker-source.h" "" crash ["tracker-write1"] ["not have its creation tracked","tracker-write1","tracker-write1.txt"] build ["tracker-write2"] crash ["tracker-read1"] ["used but not depended upon","tracker-source1"] build ["tracker-read2"] crash ["tracker-read3"] ["depended upon after being used","tracker-source2"] build ["tracker-compile.o"] build ["tracker-compile-auto.o"] build ["tracker-relative1"] crash ["tracker-relative2"] ["lint2/tracker-relative"] shake-0.19.8/src/Test/Live.hs0000644000000000000000000000124707346545000014042 0ustar0000000000000000 module Test.Live(main) where import Development.Shake import Test.Type main = testBuild test $ do "foo" %> \ out -> do need ["bar"] writeFile' out "" "bar" %> \out -> writeFile' out "" "baz" %> \out -> writeFile' out "" test build = do build ["clean"] build ["foo","baz","--live=live.txt"] assertContentsUnordered "live.txt" $ words "foo bar baz" build ["foo","baz","--live=live.txt"] assertContentsUnordered "live.txt" $ words "foo bar baz" build ["foo","--live=live.txt"] assertContentsUnordered "live.txt" $ words "foo bar" build ["bar","--live=live.txt"] assertContentsUnordered "live.txt" $ words "bar" shake-0.19.8/src/Test/Manual.hs0000644000000000000000000000200107346545000014345 0ustar0000000000000000 module Test.Manual(main) where import Development.Shake import Development.Shake.FilePath import Test.Type import General.Extra import Data.Maybe import System.Info.Extra main = testSimple $ do -- we use .git as our destination, despite not being a real git repo -- so that search tools ignore it, and I don't get dupes for every source file let dest = ".git" copyDirectoryChanged (shakeRoot "docs/manual") dest copyDirectoryChanged (shakeRoot "src/Development") $ dest "Development" copyDirectoryChanged (shakeRoot "src/General") $ dest "General" copyFileChangedIO (shakeRoot "src/Paths.hs") $ dest "Paths_shake.hs" 
(_, gccPath) <- findGcc let opts = [Cwd dest, Shell, AddPath [] (maybeToList gccPath)] let cmdline = if isWindows then "build.bat" else "/bin/sh build.sh" cmd_ opts cmdline "-j2" assertExists $ dest </> "_build/run" <.> exe cmd_ opts cmdline cmd_ opts [cmdline,"clean"] assertMissing $ dest </> "_build/run" <.> exe shake-0.19.8/src/Test/Match.hs -- | Test the rule matching facilities - alternatives, priority etc. module Test.Match(main) where import Development.Shake import Test.Type main = testBuild test $ do let output x file = writeFile' file x ["or*","*or"] |%> output "" alternatives $ do "alternative.t*" %> output "alternative.t*" "alternative.*" %> output "alternative.*" priority 100 $ priority 0 $ "priority.txt" %> output "100" priority 50 $ "priority.txt" %> output "50" alternatives $ do priority 20 $ "altpri.txt" %> output "20" priority 40 $ "altpri.txt" %> output "40" priority 30 $ "altpri.txt" %> output "30" alternatives $ do priority 21 $ "altpri2.txt" %> output "21" priority 22 $ "altpri2.txt" %> output "22" priority 23 $ "altpri2.txt" %> output "23" priority 55 $ alternatives $ "x" %> output "55" priority 51 $ "x" %> output "51" priority 42 $ alternatives $ "xx" %> output "42" priority 43 $ "xx" %> output "43" priority 10 $ do priority 6 $ "change" %> output "6" priority 7 $ "change" %> output "7" priority 8 $ "change" %> output "8" priority 9 $ "change" %> output "9" test build = do build ["clean"] build ["or"] build ["alternative.foo","alternative.txt"] assertContents "alternative.foo" "alternative.*" assertContents "alternative.txt" "alternative.t*" build ["priority.txt"] assertContents "priority.txt" "100" build ["altpri.txt","altpri2.txt"] assertContents "altpri.txt" "30" assertContents "altpri2.txt" "23" build ["x","xx"] assertContents "x" "55" assertContents "xx" "43" assertException ["matches multiple rules","3"] $ build ["change","--quiet"]
shake-0.19.8/src/Test/Monad.hs0000644000000000000000000000604107346545000014176 0ustar0000000000000000 module Test.Monad(main) where import Test.Type import Development.Shake.Internal.Core.Monad import Data.Either.Extra import Data.IORef import Control.Concurrent import Control.Exception import Control.Monad.Extra import Control.Monad.IO.Class run :: ro -> rw -> RAW () () ro rw a -> IO a run ro rw m = do res <- newEmptyMVar runRAW pure ro rw m $ void . tryPutMVar res eitherM throwIO pure (readMVar res) main = testSimple $ do let conv x = mapLeft fromException x :: Either (Maybe ArithException) Int let dump rw = liftIO . (=== rw) =<< getRW -- test the basics plus exception handling run 1 "test" $ do dump "test" putRW "more" dump "more" res <- tryRAW $ do dump "more" modifyRW (++ "x") dump "morex" pure 100 liftIO $ conv res === Right 100 dump "morex" putRW "new" dump "new" res <- tryRAW $ do dump "new" modifyRW (++ "z") dump "newz" throwRAW Overflow error "Should not have reached here" pure 9 liftIO $ conv res === Left (Just Overflow) dump "newz" catchRAW (catchRAW (throwRAW Overflow) $ \_ -> modifyRW (++ "x")) $ \_ -> modifyRW (++ "y") dump "newzx" catchRAW (catchRAW (throwRAW Overflow) $ \e -> modifyRW (++ "x") >> throwRAW e) $ \_ -> modifyRW (++ "y") dump "newzxxy" -- test capture run 1 "test" $ do i <- captureRAW $ \k -> k $ Right 1 liftIO $ i === 1 i <- tryRAW $ captureRAW $ \k -> k $ Left $ toException Overflow liftIO $ conv i === Left (Just Overflow) captureRAW $ \k -> k $ Right () i <- tryRAW $ throwRAW Underflow liftIO $ conv i === Left (Just Underflow) -- catch does not scope too far res <- try $ run 1 "test" $ fmap (either show id) $ tryRAW $ captureRAW $ \_ -> throwIO Overflow res === Left Overflow res <- try $ run 1 "test" $ do captureRAW $ \_ -> throwIO Overflow pure "x" res === Left Overflow -- test for GHC bug 11555 runRAW pure 1 "test" (throw Overflow :: RAW () () Int String ()) $ \res -> mapLeft fromException res === Left (Just Overflow) -- catch 
works properly if continuation called multiple times ref <- newIORef [] run 1 "test" $ flip catchRAW (const $ liftIO $ modifyIORef ref ('x':)) $ do captureRAW $ \k -> do k $ Right () k $ Right () k $ Left $ toException Overflow k $ Right () k $ Left $ toException Overflow flip catchRAW (const $ liftIO $ modifyIORef ref ('y':)) $ throwRAW $ toException Overflow (===) "xyxyy" =<< readIORef ref -- what if we throw an exception inside the continuation of run ref <- newIORef 0 res <- try $ runRAW pure 1 "test" (pure 1) $ \_ -> do modifyIORef ref (+1) throwIO Overflow res === Left Overflow (=== 1) =<< readIORef ref shake-0.19.8/src/Test/Ninja.hs0000644000000000000000000001154407346545000014203 0ustar0000000000000000 module Test.Ninja(main) where import Development.Shake import qualified Development.Shake.Config as Config import System.Directory(copyFile, removeFile) import Control.Monad import General.GetOpt import General.Extra import Test.Type import qualified Data.HashMap.Strict as Map import Data.List.Extra import System.IO.Extra import qualified Run import System.Environment opts = Option "" ["arg"] (ReqArg Right "") "" -- | Set to True to test with real Ninja -- On Windows doesn't work because echo foo > 1 isn't supported real_ninja = False main = testBuildArgs test [opts] $ \opts -> do let real = "real" `elem` opts action $ if real || real_ninja then cmd "ninja" opts else liftIO $ withArgs ("--lint":"--report=report.html":opts) Run.main test build = do let runEx ninja shake = build $ "--exception" : "--no-report" : map ("--arg=" ++) (words ninja) ++ words shake let run ninja = runEx ninja [] let runFail ninja bad = assertException [bad] $ runEx ninja "--quiet" build ["clean"] run "-f../../src/Test/Ninja/test1.ninja" assertExists "out1.txt" run "-f../../src/Test/Ninja/test2.ninja" assertExists "out2.2" assertMissing "out2.1" build ["clean"] run "-f../../src/Test/Ninja/test2.ninja out2.1" assertExists "out2.1" assertMissing "out2.2" copyFile 
"../../src/Test/Ninja/test3-sub.ninja" "test3-sub.ninja" copyFile "../../src/Test/Ninja/test3-inc.ninja" "test3-inc.ninja" createDirectoryRecursive "subdir" copyFile "../../src/Test/Ninja/subdir/1.ninja" "subdir/1.ninja" copyFile "../../src/Test/Ninja/subdir/2.ninja" "subdir/2.ninja" run "-f../../src/Test/Ninja/test3.ninja" assertContentsWords "out3.1" "g4+b1+++i1" assertContentsWords "out3.2" "g4++++i1" assertContentsWords "out3.3" "g4++++i1" assertContentsWords "out3.4" "g4+++s1+s2" run "-f../../src/Test/Ninja/test4.ninja out" assertExists "out.txt" assertExists "out2.txt" run "-f../../src/Test/Ninja/test5.ninja" assertExists "output file" -- #565, check multi-file rules that don't create their contents run "-f../../src/Test/Ninja/test7.ninja" writeFile "nocreate.log" "" writeFile "nocreate.in" "" run "-f../../src/Test/Ninja/nocreate.ninja" assertContentsWords "nocreate.log" "x" run "-f../../src/Test/Ninja/nocreate.ninja" run "-f../../src/Test/Ninja/nocreate.ninja" assertContentsWords "nocreate.log" "x x x" writeFile "input" "" runFail "-f../../src/Test/Ninja/lint.ninja bad --lint" "'needed' file required rebuilding" run "-f../../src/Test/Ninja/lint.ninja good --lint" runFail "-f../../src/Test/Ninja/lint.ninja bad --lint" "not a pre-dependency" res <- fmap (drop1 . lines . 
fst) $ captureOutput $ runEx "-f../../src/Test/Ninja/compdb.ninja -t compdb cxx" "--quiet" want <- lines <$> readFile "../../src/Test/Ninja/compdb.output" let eq a b | (a1,'*':a2) <- break (== '*') a = unless (a1 `isPrefixOf` b && a2 `isSuffixOf` b) $ a === b | otherwise = a === b length want === length res zipWithM_ eq want res -- Test initial variable bindings and variables in include/subninja statements let test6 = "test6" copyFile "../../src/Test/Ninja/test6-sub.ninja" $ test6 ++ "-sub.ninja" copyFile "../../src/Test/Ninja/test6-inc.ninja" $ test6 ++ "-inc.ninja" copyFile "../../src/Test/Ninja/test6.ninja" $ test6 ++ ".ninja" config <- Config.readConfigFileWithEnv [("v1", test6)] $ test6 ++ ".ninja" -- The file included by subninja should have a separate variable scope Map.lookup "v2" config === Just "g2" run "-f../../src/Test/Ninja/phonyorder.ninja bar.txt" -- tests from ninjasmith: https://github.com/ndmitchell/ninjasmith/ run "-f../../src/Test/Ninja/redefine.ninja" assertContentsWords "redefine.txt" "version3 version2" run "-f../../src/Test/Ninja/buildseparate.ninja" assertContentsWords "buildseparate.txt" "XX" run "-f../../src/Test/Ninja/lexical.ninja" assertContentsWords "lexical.txt" "XFoo_BarXXFooX.bar" run "-f../../src/Test/Ninja/continuations.ninja" assertExists "continuations.txt" copyFile "../../src/Test/Ninja/restart.ninja" "restart.ninja" runEx "-frestart.ninja" "--sleep" assertExists "restart.txt" createDirectoryRecursive "directory1" createDirectoryRecursive "directory2" run "-f../../src/Test/Ninja/allow_directory.ninja" when False $ do -- currently fails because Shake doesn't match Ninja here run "-f../../src/Test/Ninja/outputtouch.ninja" assertContentsWords "outputtouch.txt" "hello" writeFile "outputtouch.txt" "goodbye" run "-f../../src/Test/Ninja/outputtouch.ninja" assertContentsWords "outputtouch.txt" "goodbye" removeFile "outputtouch.txt" run "-f../../src/Test/Ninja/outputtouch.ninja" assertContentsWords "outputtouch.txt" "hello" 
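The compdb comparison above uses a small wildcard rule: a single `*` in the expected line (for example the machine-specific `"directory"` field in compdb.output) matches any infix of the actual line. That rule can be sketched as a standalone predicate; the name `matchStar` is a hypothetical stand-in for the local `eq` helper, returning a `Bool` rather than asserting.

```haskell
import Data.List (isPrefixOf, isSuffixOf)

-- Sketch of the wildcard comparison used for the compdb output check:
-- a single '*' in the expected line matches any infix of the actual line,
-- otherwise the lines must be equal.
matchStar :: String -> String -> Bool
matchStar expected actual =
    case break (== '*') expected of
        (a1, '*':a2) -> a1 `isPrefixOf` actual && a2 `isSuffixOf` actual
        _            -> expected == actual
```

With this rule, an expected line `"directory": "*",` matches any actual directory value, while lines without a `*` are compared exactly, which is why the test also checks `length want === length res` before zipping the two lists together.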
shake-0.19.8/src/Test/Ninja/0000755000000000000000000000000007346545000013642 5ustar0000000000000000shake-0.19.8/src/Test/Ninja/allow_directory.ninja0000644000000000000000000000023107346545000020061 0ustar0000000000000000rule create command = echo 1 > $out build root: phony || allow_directory.txt directory1 build allow_directory.txt: create | directory2 default root shake-0.19.8/src/Test/Ninja/buildseparate.ninja0000644000000000000000000000026707346545000017514 0ustar0000000000000000 rule record # produces XX command = hello command = echo $vBar $command > $out build buildseparate.txt: record vFoo = foo vBar = bar vBar = X${vFoo}${vBar}X shake-0.19.8/src/Test/Ninja/compdb.ninja0000644000000000000000000000244307346545000016132 0ustar0000000000000000# Copied from the Ninja repo ninja_required_version = 1.3 builddir = build cxx = g++ ar = ar cflags = -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti $ -fno-exceptions -pipe -Wno-missing-field-initializers $ -DNINJA_PYTHON="python.exe" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 ldflags = -L$builddir -static rule cxx command = $cxx -MMD -MT $out -MF $out.d $cflags -c $in -o $out description = CXX $out depfile = $out.d deps = gcc rule ar command = cmd /c $ar cqs $out.tmp $in && move /Y $out.tmp $out description = AR $out rule link command = $cxx $ldflags -o $out $in $libs description = LINK $out # the depfile parser and ninja lexers are generated using re2c. # Core source files all build into ninja library. 
build $builddir\build.o: cxx src\build.cc build $builddir\build_log.o: cxx src\build_log.cc build $builddir\clean.o: cxx src\clean.cc rule doxygen command = doxygen $in description = DOXYGEN $in doxygen_mainpage_generator = src\gen_doxygen_mainpage.sh rule doxygen_mainpage command = $doxygen_mainpage_generator $in > $out description = DOXYGEN_MAINPAGE $out build $builddir\doxygen_mainpage: doxygen_mainpage README COPYING | $ $doxygen_mainpage_generator build doxygen: doxygen doc\doxygen.config | $builddir\doxygen_mainpage default ninja.exe shake-0.19.8/src/Test/Ninja/compdb.output0000644000000000000000000000204207346545000016366 0ustar0000000000000000[ { "directory": "*", "command": "g++ -MMD -MT build\\build.o -MF build\\build.o.d -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti -fno-exceptions -pipe -Wno-missing-field-initializers -DNINJA_PYTHON=\"python.exe\" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 -c src\\build.cc -o build\\build.o", "file": "src\\build.cc" }, { "directory": "*", "command": "g++ -MMD -MT build\\build_log.o -MF build\\build_log.o.d -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti -fno-exceptions -pipe -Wno-missing-field-initializers -DNINJA_PYTHON=\"python.exe\" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 -c src\\build_log.cc -o build\\build_log.o", "file": "src\\build_log.cc" }, { "directory": "*", "command": "g++ -MMD -MT build\\clean.o -MF build\\clean.o.d -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti -fno-exceptions -pipe -Wno-missing-field-initializers -DNINJA_PYTHON=\"python.exe\" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 -c src\\clean.cc -o build\\clean.o", "file": "src\\clean.cc" } ] shake-0.19.8/src/Test/Ninja/continuations.ninja0000644000000000000000000000015407346545000017560 0ustar0000000000000000 rule $ run command $ = $ touch $ $out build $ continuations.txt $ : $ run shake-0.19.8/src/Test/Ninja/lexical.ninja0000644000000000000000000000020107346545000016275 0ustar0000000000000000 rule test.run command 
= echo ${foo.bar}$foo.bar > $out build lexical.txt: test.run foo.bar = XFoo_BarX foo = XFooX shake-0.19.8/src/Test/Ninja/lint.ninja0000644000000000000000000000054307346545000015633 0ustar0000000000000000 rule gen command = echo $out > $out rule run command = (echo $out : $out.gen > $out.d) && (echo $out > $out) depfile = $out.d build good: phony good1 good2 build good1: run || good1.gen build good1.gen: gen build good2: run || good2_phony build good2_phony: phony good2.gen build good2.gen: gen build bad: run | input build bad.gen: gen shake-0.19.8/src/Test/Ninja/nocreate.ninja0000644000000000000000000000012407346545000016460 0ustar0000000000000000 rule gen command = echo x >> nocreate.log build nocreate.out: gen nocreate.in shake-0.19.8/src/Test/Ninja/outputtouch.ninja0000644000000000000000000000011407346545000017262 0ustar0000000000000000 rule record command = echo hello > $out build outputtouch.txt: record shake-0.19.8/src/Test/Ninja/phonyorder.ninja0000644000000000000000000000025407346545000017055 0ustar0000000000000000 rule create command = touch $out rule copy command = cp $from $out build Foo2: phony || foo.txt build bar.txt: copy || Foo2 from = foo.txt build foo.txt: create shake-0.19.8/src/Test/Ninja/redefine.ninja0000644000000000000000000000031507346545000016443 0ustar0000000000000000 vGlobal = version1 rule record # should produce version3 version2 command = echo $vGlobal $vBuild > $out vGlobal = version2 build redefine.txt: record vBuild = $vGlobal vGlobal = version3 shake-0.19.8/src/Test/Ninja/restart.ninja0000644000000000000000000000020107346545000016340 0ustar0000000000000000 rule add command = echo build restart.txt: self >> $out build restart.ninja: add rule self command = echo $out > $out shake-0.19.8/src/Test/Ninja/subdir/0000755000000000000000000000000007346545000015132 5ustar0000000000000000shake-0.19.8/src/Test/Ninja/subdir/1.ninja0000644000000000000000000000012707346545000016313 0ustar0000000000000000# weirdly, Ninja includes are not relative 
to who includes them include subdir/2.ninja shake-0.19.8/src/Test/Ninja/subdir/2.ninja0000644000000000000000000000000007346545000016302 0ustar0000000000000000shake-0.19.8/src/Test/Ninja/test1.ninja0000644000000000000000000000007007346545000015720 0ustar0000000000000000 rule run command = touch $out build out1.txt: run shake-0.19.8/src/Test/Ninja/test2.ninja0000644000000000000000000000013007346545000015716 0ustar0000000000000000 rule run command = touch $out build out2.1: run build out2.2: run default out2.2 shake-0.19.8/src/Test/Ninja/test3-inc.ninja0000644000000000000000000000003407346545000016471 0ustar0000000000000000v5 = i1 build out3.3: dump shake-0.19.8/src/Test/Ninja/test3-sub.ninja0000644000000000000000000000005507346545000016514 0ustar0000000000000000v4 = s1 v5 = s1 build out3.4: dump v5 = s2 shake-0.19.8/src/Test/Ninja/test3.ninja0000644000000000000000000000034507346545000015727 0ustar0000000000000000 v1 = g1 v5 = g1 rule dump command = echo $v1+$v2+$v3+$v4+$v5 > $out v1 = g2 build out3.1: dump v2 = b1 v1 = g3 subninja test3-sub.ninja include test3-inc.ninja build out3.2: dump v1 = g4 include subdir/1.ninja shake-0.19.8/src/Test/Ninja/test4.ninja0000644000000000000000000000017207346545000015726 0ustar0000000000000000 rule run command = touch $out build ./out.txt: run build dir/../out2.txt: run build out: phony ./out.txt out2.txt shake-0.19.8/src/Test/Ninja/test5.ninja0000644000000000000000000000007407346545000015730 0ustar0000000000000000 rule run command = touch $out build output$ file: run shake-0.19.8/src/Test/Ninja/test6-inc.ninja0000644000000000000000000000001007346545000016466 0ustar0000000000000000v2 = g2 shake-0.19.8/src/Test/Ninja/test6-sub.ninja0000644000000000000000000000001007346545000016506 0ustar0000000000000000v2 = g3 shake-0.19.8/src/Test/Ninja/test6.ninja0000644000000000000000000000007107346545000015726 0ustar0000000000000000v2 = g1 include ${v1}-inc.ninja subninja ${v1}-sub.ninja 
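The test3*/test6* fixtures above exercise Ninja's scoping rules: `include` splices a file's statements into the current scope, so its bindings persist afterwards, while `subninja` evaluates the file in a child scope whose bindings do not leak back to the parent. A minimal, hypothetical sketch of that environment model in plain Haskell (illustrative only, not Shake's actual Ninja parser; the `Stmt`/`run` names are invented):

```haskell
-- Toy model of Ninja's two inclusion forms. An environment is an
-- association list where the most recent binding shadows earlier ones.
data Stmt = Def String String   -- "name = value"
          | Include [Stmt]      -- include file.ninja: shares the scope
          | Subninja [Stmt]     -- subninja file.ninja: child scope, dropped

run :: [(String, String)] -> [Stmt] -> [(String, String)]
run env []                    = env
run env (Def k v     : rest)  = run ((k, v) : env) rest
run env (Include ss  : rest)  = run (run env ss) rest  -- bindings persist
run env (Subninja _  : rest)  = run env rest           -- bindings discarded

main :: IO ()
main = do
  -- Mirrors test6.ninja: v2 = g1, then the included file sets g2,
  -- and the subninja'd file sets g3 (which must not escape).
  print $ lookup "v2" $ run [] [Def "v2" "g1", Include  [Def "v2" "g2"]]  -- Just "g2"
  print $ lookup "v2" $ run [] [Def "v2" "g1", Subninja [Def "v2" "g3"]]  -- Just "g1"
```

This also explains the comment in subdir/1.ninja: because `include` shares scope rather than creating a new file context, include paths are resolved relative to the top-level invocation, not the including file.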
shake-0.19.8/src/Test/Ninja/test7.ninja0000644000000000000000000000031007346545000015723 0ustar0000000000000000# #565, check multi-file rules that don't create their contents ninja_required_version = 1.5 build test : phony a b rule CUSTOM_COMMAND command = $COMMAND build a b : CUSTOM_COMMAND COMMAND = cd shake-0.19.8/src/Test/Oracle.hs0000644000000000000000000001012607346545000014344 0ustar0000000000000000{-# LANGUAGE TypeFamilies, ConstraintKinds, ScopedTypeVariables #-} {-# LANGUAGE DeriveDataTypeable, TypeOperators, GeneralizedNewtypeDeriving #-} {-# OPTIONS_GHC -Wno-orphans #-} module Test.Oracle(main) where import Development.Shake import Development.Shake.Classes import Development.Shake.FilePath import General.GetOpt import Data.List.Extra import Data.Tuple.Extra import Test.Type hiding (RandomType) import qualified Test.Type as T import Control.Monad import General.Extra -- These are instances we'll compute over type instance RuleResult String = String type instance RuleResult Int = String type instance RuleResult () = String type instance RuleResult Bool = Bool -- test results don't have to be a boolean newtype FileLen = FileLen FilePath deriving (Eq,Show,NFData,Typeable,Hashable,Binary) type instance RuleResult FileLen = Int newtype RandomType = RandomType (BinarySentinel String) deriving (Eq,Show,NFData,Typeable,Hashable,Binary) type instance RuleResult RandomType = Int type instance RuleResult T.RandomType = Int data Define = Define String String -- this type produces this result opt = [Option "" ["def"] (ReqArg (Right . uncurry Define . second tailErr . breakOn "=") "type=value") ""] main = testBuildArgs test opt $ \args -> do addOracle $ \(T.RandomType _) -> pure 42 addOracle $ \(RandomType _) -> pure (-42) "randomtype.txt" %> \out -> do a <- askOracle $ T.RandomType $ BinarySentinel () b <- askOracle $ RandomType $ BinarySentinel () writeFile' out $ show (a,b) addOracle $ \b -> pure $ not b "true.txt" %> \out -> writeFile' out . 
show =<< askOracle False let add :: forall a . (ShakeValue a, RuleResult a ~ String) => String -> a -> Rules () add name key = do name <.> "txt" %> \out -> do liftIO $ appendFile ".log" "." writeFile' out =<< askOracle key forM_ [val | Define nam val <- args, nam == name] $ \val -> addOracle $ \(_ :: a) -> pure val add "string" "" add "unit" () add "int" (0 :: Int) addOracleCache $ \(FileLen file) -> do liftIO $ appendFile ".log" "#" length <$> readFile' file "cache_out.txt" %> \out -> do o <- askOracle (FileLen "cache_in.txt") liftIO $ appendFile ".log" "!" writeFile' out $ show o test build = do build ["clean"] build ["randomtype.txt"] assertContents "randomtype.txt" "(42,-42)" -- check it rebuilds when it should writeFile ".log" "" build ["--def=string=name","string.txt"] assertContents "string.txt" "name" build ["--def=string=name","string.txt"] assertContents "string.txt" "name" build ["--def=string=test","string.txt"] assertContents "string.txt" "test" assertContents ".log" ".." -- check adding/removing redundant oracles does not trigger a rebuild build ["--def=string=test","string.txt","--def=unit=bob"] build ["--def=string=test","string.txt","--def=int=fred"] build ["--def=string=test","string.txt"] assertContents "string.txt" "test" assertContents ".log" ".." writeFile ".log" "" writeFile "cache_in.txt" "aaa" build ["cache_out.txt","--sleep"] assertContents "cache_out.txt" "3" assertContents ".log" "#!" writeFile "cache_in.txt" "zzz" build ["cache_out.txt","--sleep"] build ["cache_out.txt","--sleep"] assertContents "cache_out.txt" "3" assertContents ".log" "#!#" writeFile "cache_in.txt" "zzzz" build ["cache_out.txt","--sleep"] assertContents "cache_out.txt" "4" assertContents ".log" "#!##!" 
-- check error messages are good let errors args err = assertExceptionAfter (replace "\\" "/") err $ build $ "--quiet" : args build ["--def=unit=test","unit.txt"] errors ["unit.txt"] -- Building with an an Oracle that has been removed ["missing a call to addOracle"] errors ["int.txt"] -- Building with an Oracle that I know nothing about ["missing a call to addOracle"] errors ["--def=string=1","--def=string=1"] -- Two Oracles defined in one go ["oracle defined twice", "Test/Oracle.hs"] shake-0.19.8/src/Test/OrderOnly.hs0000644000000000000000000000302607346545000015055 0ustar0000000000000000 module Test.OrderOnly(main) where import Development.Shake import Test.Type import System.Directory(removeFile) import Control.Exception.Extra main = testBuild test $ do "bar.txt" %> \out -> do alwaysRerun writeFile' out =<< liftIO (readFile "bar.in") "foo.txt" %> \out -> do let src = "bar.txt" orderOnly [src] writeFile' out =<< liftIO (readFile src) need [src] "baz.txt" %> \out -> do let src = "bar.txt" orderOnly [src] liftIO $ appendFile out "x" "primary.txt" %> \out -> do need ["source.txt"] orderOnly ["intermediate.txt"] writeFile' out =<< liftIO (readFile "intermediate.txt") "intermediate.txt" %> \out -> copyFile' "source.txt" out test build = do writeFile "bar.in" "in" build ["foo.txt","--sleep"] assertContents "foo.txt" "in" writeFile "bar.in" "out" build ["foo.txt","--sleep"] assertContents "foo.txt" "out" writeFile "baz.txt" "" writeFile "bar.in" "in" build ["baz.txt","--sleep"] assertContents "baz.txt" "x" writeFile "bar.in" "out" build ["baz.txt"] assertContents "baz.txt" "x" ignore $ removeFile "intermediate.txt" writeFile "source.txt" "x" build ["primary.txt","--sleep"] assertContents "intermediate.txt" "x" removeFile "intermediate.txt" build ["primary.txt","--sleep"] assertMissing "intermediate.txt" writeFile "source.txt" "y" build ["primary.txt","--sleep"] assertContents "intermediate.txt" "y" 
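Test/OrderOnly.hs above checks the semantics of `orderOnly`: order-only inputs are built before the rule body runs, but a later change to them alone never invalidates the rule's stored result (hence `baz.txt` staying `"x"` after `bar.in` changes). A hypothetical base-only sketch of that dirty check (illustrative; `dirty`/the timestamp model are invented, not Shake's real database logic):

```haskell
-- A rule's stored state records only the stamps of its 'need'ed inputs.
-- Order-only inputs are built first but never recorded, so changing them
-- by themselves cannot make the rule dirty.
dirty :: [(FilePath, Int)]  -- current stamps on disk
      -> [(FilePath, Int)]  -- stamps recorded when the rule last ran
      -> Bool
dirty now needs = any changed needs
  where changed (f, t) = lookup f now /= Just t

main :: IO ()
main = do
  -- Mirrors the primary.txt rule: need ["source.txt"],
  -- orderOnly ["intermediate.txt"] (so intermediate.txt is not recorded).
  let needs = [("source.txt", 1)]
  print $ dirty [("source.txt", 1), ("intermediate.txt", 9)] needs  -- False
  print $ dirty [("source.txt", 2), ("intermediate.txt", 9)] needs  -- True
```

Only a change to the recorded `need` input triggers a rebuild; the order-only input merely guarantees existence and ordering, which is exactly what the `assertMissing "intermediate.txt"` step in the test relies on.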
shake-0.19.8/src/Test/Parallel.hs0000644000000000000000000000652407346545000014702 0ustar0000000000000000 module Test.Parallel(main) where import Development.Shake import Test.Type import Data.Foldable import Data.Tuple.Extra import Control.Monad import Control.Concurrent.Extra import Data.IORef main = testBuild test $ do "AB.txt" %> \out -> do -- need [obj "A.txt", obj "B.txt"] (text1,text2) <- readFile' "A.txt" `par` readFile' "B.txt" writeFile' out $ text1 ++ text2 sem <- liftIO $ newQSemN 0 "papplicative_*" %> \out -> do -- wait for both to do the initial start before continuing liftIO $ assertWithin 1 $ do signalQSemN sem 1 waitQSemN sem 3 signalQSemN sem 3 writeFile' out "" phony "papplicative" $ do need ["papplicative_1"] need ["papplicative_2"] let ensureReturn = pure () ensureReturn -- should work even though we have a pure need ["papplicative_3"] "pseparate_*" %> \out -> do liftIO $ appendFile "pseparate.log" "[" liftIO $ sleep 0.1 liftIO $ appendFile "pseparate.log" "]" writeFile' out "" phony "pseparate" $ do need ["pseparate_1"] liftIO $ pure () need ["pseparate_2"] sem <- liftIO $ newQSemN 0 "ptraverse_*" %> \out -> do -- wait for all to do the initial start before continuing liftIO $ assertWithin 1 $ do signalQSemN sem 1 waitQSemN sem 8 signalQSemN sem 8 writeFile' out "" phony "ptraverse" $ traverse_ (need . pure) ["ptraverse_" ++ show i | i <- [1..8]] phony "cancel" $ do writeFile' "cancel" "" done <- liftIO $ newIORef 0 lock <- liftIO newLock void $ parallel $ replicate 5 $ liftIO $ do x <- atomicModifyIORef done $ dupe . succ when (x == 3) $ do sleep 0.1; fail "boom" withLock lock $ appendFile "cancel" "x" phony "parallel" $ do active <- liftIO $ newIORef 0 peak <- liftIO $ newIORef 0 void $ parallel $ replicate 8 $ liftIO $ do now <- atomicModifyIORef active $ dupe . succ atomicModifyIORef peak $ dupe . max now sleep 0.1 atomicModifyIORef active $ dupe . 
pred peak <- liftIO $ readIORef peak writeFile' "parallel" $ show peak "parallels" %> \out -> do xs <- parallel $ replicate 5 $ parallel $ map pure [1..5] writeFile' out $ show xs phony "timings" $ void $ parallel $ map (liftIO . sleep) [1, 2, 0, 1] test build = do build ["clean"] writeFile "A.txt" "AAA" writeFile "B.txt" "BBB" build ["AB.txt","--sleep"] assertContents "AB.txt" "AAABBB" appendFile "A.txt" "aaa" build ["AB.txt"] assertContents "AB.txt" "AAAaaaBBB" assertException ["boom"] $ build ["cancel","-j1","--quiet"] assertContents "cancel" "xx" build ["parallel","-j1"] assertContents "parallel" "1" build ["parallel","-j5"] assertContents "parallel" "5" build ["parallels"] assertContents "parallels" $ show $ replicate 5 [1..5] writeFile "pseparate.log" "" build ["pseparate","-j2"] assertContents "pseparate.log" "[][]" build ["papplicative","-j3"] build ["ptraverse","-j8"] build ["timings","-j6"] assertTimings build [("timings",4)] build ["timings","-j1"] assertTimings build [("timings",4)] shake-0.19.8/src/Test/Pool.hs0000644000000000000000000000774207346545000014062 0ustar0000000000000000 module Test.Pool(main) where import Test.Type import General.Pool import Control.Concurrent.Extra import Control.Exception.Extra import Control.Monad import System.Time.Extra import Data.Either.Extra import General.Timing main = testSimple $ do -- See #474, we should never be running pool actions masked let add pool act = addPool PoolStart pool $ do Unmasked <- getMaskingState act forM_ [False,True] $ \deterministic -> do -- check that it aims for exactly the limit forM_ [1..6] $ \n -> do var <- newVar (0,0) -- (maximum, current) runPool deterministic n $ \pool -> replicateM_ 5 $ add pool $ do modifyVar_ var $ \(mx,now) -> pure (max (now+1) mx, now+1) -- requires that all tasks get spawned within 0.1s sleep 0.1 modifyVar_ var $ \(mx,now) -> pure (mx,now-1) res <- readVar var res === (min n 5, 0) -- check that exceptions are immediate good <- newVar True started <- newBarrier 
stopped <- newBarrier res <- try_ $ runPool deterministic 3 $ \pool -> do add pool $ do waitBarrier started throwIO Underflow add pool $ flip finally (signalBarrier stopped ()) $ do signalBarrier started () sleep 10 modifyVar_ good $ const $ pure False -- note that the pool finishing means we started killing our threads -- not that they have actually died mapLeft fromException res === Left (Just Underflow) waitBarrier stopped assertBoolIO (readVar good) "Must be true" -- check someone spawned when at zero todo still gets run done <- newBarrier runPool deterministic 1 $ \pool -> add pool $ add pool $ signalBarrier done () assertWithin 1 $ waitBarrier done -- check high priority stuff runs first res <- newVar "" runPool deterministic 1 $ \pool -> do let note c = modifyVar_ res $ pure . (c:) -- deliberately in a random order addPool PoolBatch pool $ note 'b' addPool PoolException pool $ note 'e' addPool PoolStart pool $ note 's' addPool PoolStart pool $ note 's' addPool PoolResume pool $ note 'r' addPool PoolException pool $ note 'e' (=== "bssree") =<< readVar res -- check that killing a thread pool stops the tasks, bug 545 -- and propagates the right exception thread <- newBarrier died <- newBarrier done <- newBarrier t <- flip forkFinally (signalBarrier died) $ runPool deterministic 1 $ \pool -> add pool $ flip onException (signalBarrier done ()) $ do flip throwTo Overflow =<< waitBarrier thread sleep 10 signalBarrier thread t assertWithin 1 $ waitBarrier done res <- assertWithin 1 $ waitBarrier died mapLeft fromException res === Left (Just Overflow) -- check that killing a thread pool aborts all threads before it returns started <- newBarrier var <- newVar False try_ $ runPool deterministic 2 $ \pool -> do add pool $ try_ $ (do signalBarrier started (); sleep 10) `finally` (do sleep 1; writeVar var True) add pool $ do waitBarrier started; throw Overflow (=== True) =<< readVar var -- benchmark for testing thread performance, see 
https://github.com/ndmitchell/shake/pull/751 when False $ do resetTimings withNumCapabilities 4 $ do (d, _) <- duration $ runPool False 4 $ \pool -> replicateM_ 200000 $ addPool PoolStart pool $ pure () print d print =<< getTimings shake-0.19.8/src/Test/Progress.hs0000644000000000000000000000413507346545000014746 0ustar0000000000000000module Test.Progress(main) where import Development.Shake.Internal.Progress import Development.Shake.Internal.Options import Test.Type import System.Directory.Extra import System.FilePath import General.Extra main = testBuild test $ pure () -- | Given a list of todo times, get out a list of how long is predicted prog = progEx 10000000000000000 progEx :: Double -> [Double] -> IO [Double] progEx mxDone todo = do let resolution = 10000 -- Use resolution to get extra detail on the numbers let done = scanl (+) 0 $ map (min mxDone . max 0) $ zipWith (-) todo (tailErr todo) let res = progressReplay $ zip (map (*resolution) [1..]) $ tailErr $ zipWith (\t d -> mempty{timeBuilt=d*resolution,timeTodo=(t*resolution,0)}) todo done pure $ (0/0) : map ((/ resolution) . actualSecs) res test build = do -- perfect functions should match perfectly xs <- prog [10,9..1] drop 2 xs === [8,7..1] xs <- prog $ map (*5) [10,9..1] drop 2 xs === [8,7..1] xs <- prog $ map (*0.2) [10,9..1] let dp3 x = fromIntegral (round $ x * 1000 :: Int) / 1000 map dp3 (drop 2 xs) === [8,7..1] -- The properties below this line could be weakened -- increasing functions can't match xs <- prog [5,6,7] last xs === 7 -- the first value must be plausible, or missing xs <- prog [187] assertBool (isNaN $ headErr xs) "No first value" -- desirable properties, could be weakened xs <- progEx 2 $ 100:map (*2) [10,9..1] drop 5 xs === [6,5..1] xs <- progEx 1 [10,9,100,8,7,6,5,4,3,2,1] assertBool (all ((<= 1.5) . 
abs) $ zipWith (-) (drop 5 xs) [6,5..1]) "Close" -- if no progress is made, don't keep the time going up xs <- prog [10,9,8,7,7,7,7,7] drop 5 xs === [7,7,7] -- if the work rate changes, should somewhat reflect that xs <- prog [10,9,8,7,6.5,6,5.5,5] assertBool (last xs > 7.1) "Some discounting (factor=0 would give 7)" xs <- listFiles $ shakeRoot "src/Test/Progress" build $ ["--progress=replay=" ++ x | x <- xs, takeExtension x == ".prog"] ++ ["--no-report","--report=-","--report=" ++ "progress.html"] shake-0.19.8/src/Test/Progress/0000755000000000000000000000000007346545000014407 5ustar0000000000000000shake-0.19.8/src/Test/Progress/progress-nan.prog0000644000000000000000000000137407346545000017723 0ustar0000000000000000(5.215521335601807,Progress {isFailure = Nothing, countSkipped = 5028, countBuilt = 0, countUnknown = 36224, countTodo = 1837, timeSkipped = 274149.05088049243, timeBuilt = 0.0, timeUnknown = 31144.710513075814, timeTodo = (4301.8477063977625,1)}) (12.997299194335938,Progress {isFailure = Nothing, countSkipped = 12154, countBuilt = 0, countUnknown = 23682, countTodo = 7253, timeSkipped = 274894.82577174786, timeBuilt = 0.0, timeUnknown = 23933.863898285897, timeTodo = (10766.919429932255,1)}) (18.471847534179688,Progress {isFailure = Nothing, countSkipped = 18273, countBuilt = 27, countUnknown = 13862, countTodo = 10931, timeSkipped = 275406.36595057615, timeBuilt = 1.7951794527471066, timeUnknown = 17710.551982904668, timeTodo = (16478.41014388157,4)}) shake-0.19.8/src/Test/Progress/self-clean-j2.prog0000644000000000000000000003047407346545000017632 0ustar0000000000000000(1.120224,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 254, countUnknown = 0, countTodo = 145, timeSkipped = 0.0, timeBuilt = 0.8861771221272647, timeUnknown = 0.0, timeTodo = (0.0,145)}) (2.4994998,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 258, countUnknown = 0, countTodo = 147, timeSkipped = 0.0, timeBuilt = 1.259251721901819, timeUnknown = 0.0, 
timeTodo = (0.0,147)}) (5.156031,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 259, countUnknown = 0, countTodo = 146, timeSkipped = 0.0, timeBuilt = 3.8557707152795047, timeUnknown = 0.0, timeTodo = (0.0,146)}) (6.4472423,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 261, countUnknown = 0, countTodo = 144, timeSkipped = 0.0, timeBuilt = 6.519303449196741, timeUnknown = 0.0, timeTodo = (0.0,144)}) (7.737371,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 262, countUnknown = 0, countTodo = 143, timeSkipped = 0.0, timeBuilt = 9.126777537865564, timeUnknown = 0.0, timeTodo = (0.0,143)}) (9.061503,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 267, countUnknown = 0, countTodo = 138, timeSkipped = 0.0, timeBuilt = 11.739121564431116, timeUnknown = 0.0, timeTodo = (0.0,138)}) (10.368634,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 268, countUnknown = 0, countTodo = 137, timeSkipped = 0.0, timeBuilt = 14.354383119149134, timeUnknown = 0.0, timeTodo = (0.0,137)}) (11.658763,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 272, countUnknown = 0, countTodo = 133, timeSkipped = 0.0, timeBuilt = 16.976646550698206, timeUnknown = 0.0, timeTodo = (0.0,133)}) (12.941892,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 274, countUnknown = 0, countTodo = 131, timeSkipped = 0.0, timeBuilt = 19.57990659098141, timeUnknown = 0.0, timeTodo = (0.0,131)}) (14.22602,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 279, countUnknown = 0, countTodo = 126, timeSkipped = 0.0, timeBuilt = 22.17116750101559, timeUnknown = 0.0, timeTodo = (0.0,126)}) (15.511148,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 282, countUnknown = 0, countTodo = 123, timeSkipped = 0.0, timeBuilt = 24.74242699961178, timeUnknown = 0.0, timeTodo = (0.0,123)}) (16.794277,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 283, countUnknown = 0, countTodo = 122, timeSkipped = 0.0, 
timeBuilt = 27.31368459085934, timeUnknown = 0.0, timeTodo = (0.0,122)}) (18.085405,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 288, countUnknown = 0, countTodo = 117, timeSkipped = 0.0, timeBuilt = 29.8999453864526, timeUnknown = 0.0, timeTodo = (0.0,117)}) (19.398537,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 294, countUnknown = 0, countTodo = 111, timeSkipped = 0.0, timeBuilt = 32.487208493752405, timeUnknown = 0.0, timeTodo = (0.0,111)}) (21.451742,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 295, countUnknown = 0, countTodo = 110, timeSkipped = 0.0, timeBuilt = 35.872547277016565, timeUnknown = 0.0, timeTodo = (0.0,110)}) (22.761873,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 297, countUnknown = 0, countTodo = 108, timeSkipped = 0.0, timeBuilt = 39.24288571695797, timeUnknown = 0.0, timeTodo = (0.0,108)}) (24.060003,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 298, countUnknown = 0, countTodo = 107, timeSkipped = 0.0, timeBuilt = 41.825143941445276, timeUnknown = 0.0, timeTodo = (0.0,107)}) (25.555153,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 304, countUnknown = 0, countTodo = 101, timeSkipped = 0.0, timeBuilt = 44.66642964701168, timeUnknown = 0.0, timeTodo = (0.0,101)}) (26.868284,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 305, countUnknown = 0, countTodo = 100, timeSkipped = 0.0, timeBuilt = 47.460708745522425, timeUnknown = 0.0, timeTodo = (0.0,100)}) (28.22642,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 306, countUnknown = 0, countTodo = 99, timeSkipped = 0.0, timeBuilt = 50.111974843544886, timeUnknown = 0.0, timeTodo = (0.0,99)}) (29.52355,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 308, countUnknown = 0, countTodo = 97, timeSkipped = 0.0, timeBuilt = 52.788244374794886, timeUnknown = 0.0, timeTodo = (0.0,97)}) (30.82868,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 309, 
countUnknown = 0, countTodo = 96, timeSkipped = 0.0, timeBuilt = 55.38850606302731, timeUnknown = 0.0, timeTodo = (0.0,96)}) (31.83378,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 309, countUnknown = 0, countTodo = 96, timeSkipped = 0.0, timeBuilt = 55.38850606302731, timeUnknown = 0.0, timeTodo = (0.0,96)}) (33.099907,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 311, countUnknown = 0, countTodo = 94, timeSkipped = 0.0, timeBuilt = 57.70873845438473, timeUnknown = 0.0, timeTodo = (0.0,94)}) (34.455044,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 312, countUnknown = 0, countTodo = 93, timeSkipped = 0.0, timeBuilt = 62.583227284951136, timeUnknown = 0.0, timeTodo = (0.0,93)}) (35.775173,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 323, countUnknown = 0, countTodo = 82, timeSkipped = 0.0, timeBuilt = 65.22149098734371, timeUnknown = 0.0, timeTodo = (0.0,82)}) (37.087307,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 324, countUnknown = 0, countTodo = 81, timeSkipped = 0.0, timeBuilt = 67.89576161722653, timeUnknown = 0.0, timeTodo = (0.0,81)}) (38.39644,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 332, countUnknown = 0, countTodo = 73, timeSkipped = 0.0, timeBuilt = 70.54603017191403, timeUnknown = 0.0, timeTodo = (0.0,73)}) (39.705566,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 336, countUnknown = 0, countTodo = 69, timeSkipped = 0.0, timeBuilt = 73.18129933695309, timeUnknown = 0.0, timeTodo = (0.0,69)}) (41.676765,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 339, countUnknown = 0, countTodo = 66, timeSkipped = 0.0, timeBuilt = 76.48663152079098, timeUnknown = 0.0, timeTodo = (0.0,66)}) (42.969894,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 342, countUnknown = 0, countTodo = 63, timeSkipped = 0.0, timeBuilt = 79.77195943216793, timeUnknown = 0.0, timeTodo = (0.0,63)}) (44.364033,Progress {isFailure = Nothing, 
countSkipped = 0, countBuilt = 345, countUnknown = 0, countTodo = 60, timeSkipped = 0.0, timeBuilt = 82.46222890238278, timeUnknown = 0.0, timeTodo = (0.0,60)}) (45.367134,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 348, countUnknown = 0, countTodo = 57, timeSkipped = 0.0, timeBuilt = 85.15849507669918, timeUnknown = 0.0, timeTodo = (0.0,57)}) (47.0483,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 351, countUnknown = 0, countTodo = 54, timeSkipped = 0.0, timeBuilt = 87.8587666831445, timeUnknown = 0.0, timeTodo = (0.0,54)}) (48.052402,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 351, countUnknown = 0, countTodo = 54, timeSkipped = 0.0, timeBuilt = 87.8587666831445, timeUnknown = 0.0, timeTodo = (0.0,54)}) (49.890587,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 0, countTodo = 51, timeSkipped = 0.0, timeBuilt = 90.69304478983395, timeUnknown = 0.0, timeTodo = (0.0,51)}) (50.894688,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 0, countTodo = 51, timeSkipped = 0.0, timeBuilt = 90.69304478983395, timeUnknown = 0.0, timeTodo = (0.0,51)}) (51.897785,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 0, countTodo = 51, timeSkipped = 0.0, timeBuilt = 90.69304478983395, timeUnknown = 0.0, timeTodo = (0.0,51)}) (54.92909,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 0, countTodo = 48, timeSkipped = 0.0, timeBuilt = 94.47342885355465, timeUnknown = 0.0, timeTodo = (0.0,48)}) (56.23222,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 358, countUnknown = 0, countTodo = 47, timeSkipped = 0.0, timeBuilt = 97.04368794779293, timeUnknown = 0.0, timeTodo = (0.0,47)}) (57.53035,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 360, countUnknown = 0, countTodo = 45, timeSkipped = 0.0, timeBuilt = 100.92107594828121, timeUnknown = 0.0, timeTodo = (0.0,45)}) 
(58.82748,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 361, countUnknown = 0, countTodo = 44, timeSkipped = 0.0, timeBuilt = 103.52433598856442, timeUnknown = 0.0, timeTodo = (0.0,44)}) (60.104607,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 368, countUnknown = 0, countTodo = 37, timeSkipped = 0.0, timeBuilt = 106.14560521463864, timeUnknown = 0.0, timeTodo = (0.0,37)}) (62.0268,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 369, countUnknown = 0, countTodo = 36, timeSkipped = 0.0, timeBuilt = 109.38093007425778, timeUnknown = 0.0, timeTodo = (0.0,36)}) (63.33293,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 372, countUnknown = 0, countTodo = 33, timeSkipped = 0.0, timeBuilt = 112.58624852518551, timeUnknown = 0.0, timeTodo = (0.0,33)}) (64.62606,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 374, countUnknown = 0, countTodo = 31, timeSkipped = 0.0, timeBuilt = 115.16150487284176, timeUnknown = 0.0, timeTodo = (0.0,31)}) (65.925186,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 379, countUnknown = 0, countTodo = 26, timeSkipped = 0.0, timeBuilt = 117.77476704935543, timeUnknown = 0.0, timeTodo = (0.0,26)}) (68.11641,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 0, countTodo = 18, timeSkipped = 0.0, timeBuilt = 122.62825597147457, timeUnknown = 0.0, timeTodo = (0.0,18)}) (69.121506,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 0, countTodo = 18, timeSkipped = 0.0, timeBuilt = 122.62825597147457, timeUnknown = 0.0, timeTodo = (0.0,18)}) (70.77867,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 390, countUnknown = 0, countTodo = 15, timeSkipped = 0.0, timeBuilt = 125.30251897196285, timeUnknown = 0.0, timeTodo = (0.0,15)}) (74.353035,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 393, countUnknown = 0, countTodo = 12, timeSkipped = 0.0, timeBuilt = 127.54874242166989, 
timeUnknown = 0.0, timeTodo = (0.0,12)}) (75.35613,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 396, countUnknown = 0, countTodo = 9, timeSkipped = 0.0, timeBuilt = 130.15401280741207, timeUnknown = 0.0, timeTodo = (0.0,9)}) (76.85028,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 399, countUnknown = 0, countTodo = 6, timeSkipped = 0.0, timeBuilt = 133.99539006571285, timeUnknown = 0.0, timeTodo = (0.0,6)}) (79.081505,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 402, countUnknown = 0, countTodo = 3, timeSkipped = 0.0, timeBuilt = 136.22862065653317, timeUnknown = 0.0, timeTodo = (0.0,3)}) (81.25272,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (82.260826,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (83.26492,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (84.27003,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (85.274124,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (86.28023,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (87.284325,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 
138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (88.28843,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) shake-0.19.8/src/Test/Progress/self-rebuild-j2.prog0000644000000000000000000002753607346545000020203 0ustar0000000000000000(3.6053605,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 258, countUnknown = 1, countTodo = 147, timeSkipped = 0.0, timeBuilt = 1.054105345858261, timeUnknown = 0.0, timeTodo = (135.01951599121094,0)}) (4.8964896,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 259, countUnknown = 1, countTodo = 146, timeSkipped = 0.0, timeBuilt = 3.6373634978663176, timeUnknown = 0.0, timeTodo = (132.3802523612976,0)}) (6.189619,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 260, countUnknown = 1, countTodo = 145, timeSkipped = 0.0, timeBuilt = 6.2686262771021575, timeUnknown = 0.0, timeTodo = (129.85099935531616,0)}) (7.4677467,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 262, countUnknown = 1, countTodo = 143, timeSkipped = 0.0, timeBuilt = 8.854884450091049, timeUnknown = 0.0, timeTodo = (127.25373983383179,0)}) (8.742874,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 268, countUnknown = 1, countTodo = 137, timeSkipped = 0.0, timeBuilt = 11.444143597735092, timeUnknown = 0.0, timeTodo = (124.70048475265503,0)}) (10.016002,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 271, countUnknown = 1, countTodo = 134, timeSkipped = 0.0, timeBuilt = 16.5466540500056, timeUnknown = 0.0, timeTodo = (119.64597797393799,0)}) (11.30813,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 271, countUnknown = 1, countTodo = 134, timeSkipped = 0.0, timeBuilt = 16.5466540500056, timeUnknown = 0.0, timeTodo = (119.64597797393799,0)}) (12.59526,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 276, countUnknown = 
1, countTodo = 129, timeSkipped = 0.0, timeBuilt = 19.123910252703354, timeUnknown = 0.0, timeTodo = (117.0657205581665,0)}) (13.876388,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 278, countUnknown = 1, countTodo = 127, timeSkipped = 0.0, timeBuilt = 21.708168332232162, timeUnknown = 0.0, timeTodo = (114.43845868110657,0)}) (15.173517,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 280, countUnknown = 1, countTodo = 125, timeSkipped = 0.0, timeBuilt = 24.27642471040599, timeUnknown = 0.0, timeTodo = (111.89620327949524,0)}) (16.456646,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 282, countUnknown = 1, countTodo = 123, timeSkipped = 0.0, timeBuilt = 26.85868198121898, timeUnknown = 0.0, timeTodo = (109.3659496307373,0)}) (17.726772,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 284, countUnknown = 1, countTodo = 121, timeSkipped = 0.0, timeBuilt = 29.442940060747787, timeUnknown = 0.0, timeTodo = (106.83069705963135,0)}) (19.027903,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 288, countUnknown = 1, countTodo = 117, timeSkipped = 0.0, timeBuilt = 32.00519592012279, timeUnknown = 0.0, timeTodo = (104.28944301605225,0)}) (20.345034,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 292, countUnknown = 1, countTodo = 113, timeSkipped = 0.0, timeBuilt = 34.586452786577865, timeUnknown = 0.0, timeTodo = (101.74619007110596,0)}) (21.648165,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 300, countUnknown = 1, countTodo = 105, timeSkipped = 0.0, timeBuilt = 37.32872611726634, timeUnknown = 0.0, timeTodo = (99.18993377685547,0)}) (23.717371,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 301, countUnknown = 1, countTodo = 104, timeSkipped = 0.0, timeBuilt = 40.75507003511302, timeUnknown = 0.0, timeTodo = (96.41965675354004,0)}) (25.022503,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 303, countUnknown = 1, countTodo = 102, timeSkipped = 
0.0, timeBuilt = 44.12840873445384, timeUnknown = 0.0, timeTodo = (93.89340400695801,0)}) (26.289629,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 304, countUnknown = 1, countTodo = 101, timeSkipped = 0.0, timeBuilt = 46.6976664706599, timeUnknown = 0.0, timeTodo = (91.36415100097656,0)}) (27.576757,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 305, countUnknown = 1, countTodo = 100, timeSkipped = 0.0, timeBuilt = 49.27292472566478, timeUnknown = 0.0, timeTodo = (88.50186347961426,0)}) (29.49495,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 308, countUnknown = 1, countTodo = 97, timeSkipped = 0.0, timeBuilt = 52.47924644197337, timeUnknown = 0.0, timeTodo = (84.93850708007813,0)}) (31.11511,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 311, countUnknown = 1, countTodo = 94, timeSkipped = 0.0, timeBuilt = 56.04560119356029, timeUnknown = 0.0, timeTodo = (81.36915397644043,0)}) (32.42124,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 313, countUnknown = 1, countTodo = 92, timeSkipped = 0.0, timeBuilt = 58.938888852251694, timeUnknown = 0.0, timeTodo = (78.84289932250977,0)}) (33.68837,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 315, countUnknown = 1, countTodo = 90, timeSkipped = 0.0, timeBuilt = 61.51514465059154, timeUnknown = 0.0, timeTodo = (75.96561431884766,0)}) (34.961494,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 322, countUnknown = 1, countTodo = 83, timeSkipped = 0.0, timeBuilt = 64.10740501130931, timeUnknown = 0.0, timeTodo = (73.41636085510254,0)}) (36.245625,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 330, countUnknown = 1, countTodo = 75, timeSkipped = 0.0, timeBuilt = 66.67165786470287, timeUnknown = 0.0, timeTodo = (70.8551082611084,0)}) (37.545753,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 331, countUnknown = 1, countTodo = 74, timeSkipped = 0.0, timeBuilt = 69.22791320527904, timeUnknown = 0.0, 
timeTodo = (68.3188533782959,0)}) (38.944893,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 336, countUnknown = 1, countTodo = 69, timeSkipped = 0.0, timeBuilt = 71.91318160737865, timeUnknown = 0.0, timeTodo = (65.60857582092285,0)}) (40.345036,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 337, countUnknown = 1, countTodo = 68, timeSkipped = 0.0, timeBuilt = 74.72146255220287, timeUnknown = 0.0, timeTodo = (62.849300384521484,0)}) (41.651165,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 340, countUnknown = 1, countTodo = 65, timeSkipped = 0.0, timeBuilt = 77.43573409761302, timeUnknown = 0.0, timeTodo = (60.147029876708984,0)}) (42.9913,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 345, countUnknown = 1, countTodo = 60, timeSkipped = 0.0, timeBuilt = 80.11199981416576, timeUnknown = 0.0, timeTodo = (57.615779876708984,0)}) (44.28843,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 347, countUnknown = 1, countTodo = 58, timeSkipped = 0.0, timeBuilt = 82.76326781953685, timeUnknown = 0.0, timeTodo = (55.096527099609375,0)}) (45.844585,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 350, countUnknown = 1, countTodo = 55, timeSkipped = 0.0, timeBuilt = 85.57555038179271, timeUnknown = 0.0, timeTodo = (53.285343170166016,0)}) (46.848686,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt = 88.44083816255443, timeUnknown = 0.0, timeTodo = (50.76008987426758,0)}) (47.852783,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt = 88.44083816255443, timeUnknown = 0.0, timeTodo = (50.76008987426758,0)}) (50.848083,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 1, countTodo = 48, timeSkipped = 0.0, timeBuilt = 92.16721374238841, timeUnknown = 0.0, timeTodo = (48.103824615478516,0)}) 
(52.852283,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 358, countUnknown = 1, countTodo = 47, timeSkipped = 0.0, timeBuilt = 95.5145495578181, timeUnknown = 0.0, timeTodo = (45.494564056396484,0)}) (54.130413,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 362, countUnknown = 1, countTodo = 43, timeSkipped = 0.0, timeBuilt = 100.13300735200755, timeUnknown = 0.0, timeTodo = (41.70818328857422,0)}) (55.41054,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 363, countUnknown = 1, countTodo = 42, timeSkipped = 0.0, timeBuilt = 102.67526275361888, timeUnknown = 0.0, timeTodo = (39.17192840576172,0)}) (56.414642,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 369, countUnknown = 1, countTodo = 36, timeSkipped = 0.0, timeBuilt = 105.25251800264232, timeUnknown = 0.0, timeTodo = (36.63368225097656,0)}) (59.073906,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 372, countUnknown = 1, countTodo = 33, timeSkipped = 0.0, timeBuilt = 107.67676574434154, timeUnknown = 0.0, timeTodo = (34.10843276977539,0)}) (60.356033,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 374, countUnknown = 1, countTodo = 31, timeSkipped = 0.0, timeBuilt = 110.21202117647044, timeUnknown = 0.0, timeTodo = (31.546173095703125,0)}) (61.641163,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 375, countUnknown = 1, countTodo = 30, timeSkipped = 0.0, timeBuilt = 114.02240401948802, timeUnknown = 0.0, timeTodo = (28.913909912109375,0)}) (62.92029,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 381, countUnknown = 1, countTodo = 24, timeSkipped = 0.0, timeBuilt = 116.60766250337474, timeUnknown = 0.0, timeTodo = (26.366653442382813,0)}) (65.09751,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 121.40914184297435, timeUnknown = 0.0, timeTodo = (22.06322479248047,0)}) (66.10161,Progress {isFailure = Nothing, 
countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 121.40914184297435, timeUnknown = 0.0, timeTodo = (22.06322479248047,0)}) (67.678764,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 390, countUnknown = 1, countTodo = 15, timeSkipped = 0.0, timeBuilt = 123.98239928926341, timeUnknown = 0.0, timeTodo = (19.453964233398438,0)}) (71.20812,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 393, countUnknown = 1, countTodo = 12, timeSkipped = 0.0, timeBuilt = 126.19761687959544, timeUnknown = 0.0, timeTodo = (17.226730346679688,0)}) (72.21122,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 396, countUnknown = 1, countTodo = 9, timeSkipped = 0.0, timeBuilt = 128.785888974322, timeUnknown = 0.0, timeTodo = (14.163421630859375,0)}) (73.53935,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 399, countUnknown = 1, countTodo = 6, timeSkipped = 0.0, timeBuilt = 133.6913836642634, timeUnknown = 0.0, timeTodo = (11.642166137695313,0)}) (75.757576,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 402, countUnknown = 1, countTodo = 3, timeSkipped = 0.0, timeBuilt = 135.9266131564509, timeUnknown = 0.0, timeTodo = (8.530860900878906,0)}) (77.91779,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (78.92289,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (79.925995,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (80.93109,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, 
countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (81.93419,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (82.93929,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) shake-0.19.8/src/Test/Progress/self-zero-j2.prog0000644000000000000000000003006407346545000017522 0ustar0000000000000000(1.020102,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 254, countUnknown = 4, countTodo = 148, timeSkipped = 0.0, timeBuilt = 0.7620761110447347, timeUnknown = 2.0004000980407e-3, timeTodo = (146.13152742385864,0)}) (3.7443745,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 258, countUnknown = 1, countTodo = 147, timeSkipped = 0.0, timeBuilt = 1.0521051175892353, timeUnknown = 0.0, timeTodo = (145.76045322418213,0)}) (5.032503,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 260, countUnknown = 1, countTodo = 145, timeSkipped = 0.0, timeBuilt = 3.678367782384157, timeUnknown = 0.0, timeTodo = (143.16193342208862,0)}) (6.339634,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 261, countUnknown = 1, countTodo = 144, timeSkipped = 0.0, timeBuilt = 6.317631412297487, timeUnknown = 0.0, timeTodo = (140.56067323684692,0)}) (7.6067605,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 262, countUnknown = 1, countTodo = 143, timeSkipped = 0.0, timeBuilt = 8.911890674382448, timeUnknown = 0.0, timeTodo = (137.99141645431519,0)}) (8.867887,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 265, countUnknown = 1, countTodo = 140, timeSkipped = 0.0, timeBuilt = 11.489148784428835, timeUnknown = 0.0, timeTodo = (135.31988191604614,0)}) 
(10.124012,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 266, countUnknown = 1, countTodo = 139, timeSkipped = 0.0, timeBuilt = 14.01840179041028, timeUnknown = 0.0, timeTodo = (132.74862432479858,0)}) (11.394139,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 270, countUnknown = 1, countTodo = 135, timeSkipped = 0.0, timeBuilt = 16.545654464513063, timeUnknown = 0.0, timeTodo = (130.13927960395813,0)}) (12.661266,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 278, countUnknown = 1, countTodo = 127, timeSkipped = 0.0, timeBuilt = 19.09290998056531, timeUnknown = 0.0, timeTodo = (127.51380348205566,0)}) (13.928392,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 280, countUnknown = 1, countTodo = 125, timeSkipped = 0.0, timeBuilt = 21.638164687901735, timeUnknown = 0.0, timeTodo = (124.94054412841797,0)}) (15.19352,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 283, countUnknown = 1, countTodo = 122, timeSkipped = 0.0, timeBuilt = 24.176418472081423, timeUnknown = 0.0, timeTodo = (122.32628202438354,0)}) (16.462646,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 284, countUnknown = 1, countTodo = 121, timeSkipped = 0.0, timeBuilt = 26.708671737462282, timeUnknown = 0.0, timeTodo = (119.71102046966553,0)}) (17.723772,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 286, countUnknown = 1, countTodo = 119, timeSkipped = 0.0, timeBuilt = 29.24492471292615, timeUnknown = 0.0, timeTodo = (117.13076210021973,0)}) (18.9989,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 290, countUnknown = 1, countTodo = 115, timeSkipped = 0.0, timeBuilt = 31.782178092747927, timeUnknown = 0.0, timeTodo = (114.32748413085938,0)}) (20.265026,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 294, countUnknown = 1, countTodo = 111, timeSkipped = 0.0, timeBuilt = 34.328432250767946, timeUnknown = 0.0, timeTodo = (110.95114421844482,0)}) (21.525152,Progress {isFailure = 
Nothing, countSkipped = 0, countBuilt = 296, countUnknown = 1, countTodo = 109, timeSkipped = 0.0, timeBuilt = 36.87168710306287, timeUnknown = 0.0, timeTodo = (108.3658857345581,0)}) (22.792278,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 297, countUnknown = 1, countTodo = 108, timeSkipped = 0.0, timeBuilt = 39.39593904092908, timeUnknown = 0.0, timeTodo = (105.79362773895264,0)}) (24.09941,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 301, countUnknown = 1, countTodo = 104, timeSkipped = 0.0, timeBuilt = 41.934190917760134, timeUnknown = 0.0, timeTodo = (103.46539211273193,0)}) (25.552555,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 307, countUnknown = 1, countTodo = 98, timeSkipped = 0.0, timeBuilt = 44.71846978738904, timeUnknown = 0.0, timeTodo = (100.06105041503906,0)}) (27.156715,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 310, countUnknown = 1, countTodo = 95, timeSkipped = 0.0, timeBuilt = 47.48474519327283, timeUnknown = 0.0, timeTodo = (97.22976875305176,0)}) (28.427843,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 311, countUnknown = 1, countTodo = 94, timeSkipped = 0.0, timeBuilt = 50.347032714635134, timeUnknown = 0.0, timeTodo = (94.5565013885498,0)}) (29.692968,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 312, countUnknown = 1, countTodo = 93, timeSkipped = 0.0, timeBuilt = 53.222318816930056, timeUnknown = 0.0, timeTodo = (91.93223762512207,0)}) (30.960096,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 314, countUnknown = 1, countTodo = 91, timeSkipped = 0.0, timeBuilt = 55.76357476785779, timeUnknown = 0.0, timeTodo = (89.2559642791748,0)}) (32.22322,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 317, countUnknown = 1, countTodo = 88, timeSkipped = 0.0, timeBuilt = 58.305828262120485, timeUnknown = 0.0, timeTodo = (86.59769439697266,0)}) (33.50235,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 321, 
countUnknown = 1, countTodo = 84, timeSkipped = 0.0, timeBuilt = 60.847080398350954, timeUnknown = 0.0, timeTodo = (83.98942756652832,0)}) (34.50545,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 321, countUnknown = 1, countTodo = 84, timeSkipped = 0.0, timeBuilt = 60.847080398350954, timeUnknown = 0.0, timeTodo = (83.98942756652832,0)}) (35.778576,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 327, countUnknown = 1, countTodo = 78, timeSkipped = 0.0, timeBuilt = 64.42744653299451, timeUnknown = 0.0, timeTodo = (79.1019344329834,0)}) (37.039703,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 331, countUnknown = 1, countTodo = 74, timeSkipped = 0.0, timeBuilt = 67.99280183389783, timeUnknown = 0.0, timeTodo = (75.79260063171387,0)}) (38.30183,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 332, countUnknown = 1, countTodo = 73, timeSkipped = 0.0, timeBuilt = 70.5080529935658, timeUnknown = 0.0, timeTodo = (73.16133689880371,0)}) (39.60196,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 337, countUnknown = 1, countTodo = 68, timeSkipped = 0.0, timeBuilt = 73.04030625894666, timeUnknown = 0.0, timeTodo = (70.54007148742676,0)}) (40.9971,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 341, countUnknown = 1, countTodo = 64, timeSkipped = 0.0, timeBuilt = 75.7465783841908, timeUnknown = 0.0, timeTodo = (67.84580421447754,0)}) (42.251225,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 343, countUnknown = 1, countTodo = 62, timeSkipped = 0.0, timeBuilt = 78.44684617593884, timeUnknown = 0.0, timeTodo = (64.56347465515137,0)}) (43.51035,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 348, countUnknown = 1, countTodo = 57, timeSkipped = 0.0, timeBuilt = 80.97609345987439, timeUnknown = 0.0, timeTodo = (61.85720443725586,0)}) (45.31053,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt 
= 85.3115312345326, timeUnknown = 0.0, timeTodo = (56.32666015625,0)}) (46.31663,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt = 85.3115312345326, timeUnknown = 0.0, timeTodo = (56.32666015625,0)}) (47.953796,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 1, countTodo = 48, timeSkipped = 0.0, timeBuilt = 87.96779649332166, timeUnknown = 0.0, timeTodo = (52.5462760925293,0)}) (49.22192,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 1, countTodo = 48, timeSkipped = 0.0, timeBuilt = 87.96779649332166, timeUnknown = 0.0, timeTodo = (52.5462760925293,0)}) (50.483047,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 358, countUnknown = 1, countTodo = 47, timeSkipped = 0.0, timeBuilt = 90.48504655435681, timeUnknown = 0.0, timeTodo = (49.943016052246094,0)}) (51.742172,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 362, countUnknown = 1, countTodo = 43, timeSkipped = 0.0, timeBuilt = 94.27242677286267, timeUnknown = 0.0, timeTodo = (46.054622650146484,0)}) (53.0143,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 364, countUnknown = 1, countTodo = 41, timeSkipped = 0.0, timeBuilt = 96.79567353799939, timeUnknown = 0.0, timeTodo = (43.455360412597656,0)}) (54.279427,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 365, countUnknown = 1, countTodo = 40, timeSkipped = 0.0, timeBuilt = 99.32792680338025, timeUnknown = 0.0, timeTodo = (40.88410568237305,0)}) (55.549553,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 368, countUnknown = 1, countTodo = 37, timeSkipped = 0.0, timeBuilt = 101.87118165567517, timeUnknown = 0.0, timeTodo = (38.3088493347168,0)}) (56.90669,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 371, countUnknown = 1, countTodo = 34, timeSkipped = 0.0, timeBuilt = 104.48744599893689, timeUnknown = 0.0, timeTodo = 
(35.06652069091797,0)}) (58.179817,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 372, countUnknown = 1, countTodo = 33, timeSkipped = 0.0, timeBuilt = 107.11970918253064, timeUnknown = 0.0, timeTodo = (32.46625900268555,0)}) (59.453945,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 380, countUnknown = 1, countTodo = 25, timeSkipped = 0.0, timeBuilt = 109.69997041299939, timeUnknown = 0.0, timeTodo = (29.24193572998047,0)}) (61.17312,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 114.0083963163197, timeUnknown = 0.0, timeTodo = (24.391448974609375,0)}) (62.17922,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 114.0083963163197, timeUnknown = 0.0, timeTodo = (24.391448974609375,0)}) (63.787376,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 390, countUnknown = 1, countTodo = 15, timeSkipped = 0.0, timeBuilt = 116.61765687540174, timeUnknown = 0.0, timeTodo = (21.717185974121094,0)}) (67.25372,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 393, countUnknown = 1, countTodo = 12, timeSkipped = 0.0, timeBuilt = 118.84489076212049, timeUnknown = 0.0, timeTodo = (19.470962524414063,0)}) (69.114914,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 399, countUnknown = 1, countTodo = 6, timeSkipped = 0.0, timeBuilt = 124.42945497110486, timeUnknown = 0.0, timeTodo = (13.024314880371094,0)}) (70.91309,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 402, countUnknown = 1, countTodo = 3, timeSkipped = 0.0, timeBuilt = 127.54076020792127, timeUnknown = 0.0, timeTodo = (10.791084289550781,0)}) (72.67527,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (73.68037,Progress 
{isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (74.683464,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (75.68657,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (76.68967,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (77.69477,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) shake-0.19.8/src/Test/Random.hs0000644000000000000000000001175407346545000014367 0ustar0000000000000000{-# LANGUAGE LambdaCase #-} {-# LANGUAGE PatternGuards #-} module Test.Random(main) where import Development.Shake import Numeric.Extra import Test.Type import Control.Exception.Extra import Control.Monad import Data.List import Data.Maybe import General.GetOpt import System.Environment import System.Exit import System.Random import General.Extra import qualified System.IO.Extra as IO import System.Time.Extra inputRange = [1..10] data Value = Single Int | Multiple [[Value]] deriving (Read,Show,Eq) data Source = Input Int | Output Int | Bang deriving (Read,Show) data Logic = Logic Int [[Source]] | Want [Int] deriving (Read,Show) arg = [Option "" ["arg"] (ReqArg Right "") ""] main = testBuildArgs test arg $ \args -> do let toFile (Input i) = "input-" ++ show i ++ ".txt" toFile (Output i) = "output-" ++ show i ++ ".txt" toFile Bang = 
error "BANG" let randomSleep = liftIO $ do i <- randomRIO (0, 25) sleep $ intToDouble i / 100 forM_ (map read $ filter (isNothing . asDuration) args) $ \case Want xs -> want $ map (toFile . Output) xs Logic out srcs -> toFile (Output out) %> \out -> do res <- fmap (show . Multiple) $ forM srcs $ \src -> do randomSleep need $ map toFile src mapM (liftIO . fmap read . IO.readFile' . toFile) src randomSleep writeFileChanged out res asDuration :: String -> Maybe Double asDuration x | "s" `isSuffixOf` x, [(i,"")] <- reads $ init x = Just i | "m" `isSuffixOf` x, [(i,"")] <- reads $ init x = Just $ i * 60 | otherwise = Nothing test build = do limit <- do args <- getArgs let bound = listToMaybe $ reverse $ mapMaybe asDuration args time <- offsetTime pure $ when (isJust bound) $ do now <- time when (now > fromJust bound) exitSuccess forM_ [1..] $ \count -> do limit putStrLn $ "* PERFORMING RANDOM TEST " ++ show count build ["clean"] build [] -- to create the directory forM_ inputRange $ \i -> writeFile ("input-" ++ show i ++ ".txt") $ show $ Single i logic <- randomLogic runLogic [] logic chng <- filterM (const randomIO) inputRange forM_ chng $ \i -> writeFile ("input-" ++ show i ++ ".txt") $ show $ Single $ negate i runLogic chng logic forM_ inputRange $ \i -> writeFile ("input-" ++ show i ++ ".txt") $ show $ Single i logicBang <- addBang =<< addBang logic j <- randomRIO (1::Int,8) res <- try_ $ build $ "--exception" : ("-j" ++ show j) : map ((++) "--arg=" . 
show) (logicBang ++ [Want [i | Logic i _ <- logicBang]]) case res of Left err | "BANG" `isInfixOf` show err -> pure () -- error I expected | otherwise -> error $ "UNEXPECTED ERROR: " ++ show err _ -> pure () -- occasionally we only put BANG in places with no dependencies that don't get rebuilt runLogic [] $ logic ++ [Want [i | Logic i _ <- logic]] where runLogic :: [Int] -> [Logic] -> IO () runLogic negated xs = do let poss = [i | Logic i _ <- xs] i <- randomRIO (0, 7) wants <- replicateM i $ do i <- randomRIO (0, 5) replicateM i $ randomElem poss sleepFileTime j <- randomRIO (1::Int,8) build $ ("-j" ++ show j) : map ((++) "--arg=" . show) (xs ++ map Want wants) let value i = let ys = headErr [ys | Logic j ys <- xs, j == i] in Multiple $ flip map ys $ map $ \case Input i -> Single $ if i `elem` negated then negate i else i Output i -> value i Bang -> error "BANG" forM_ (concat wants) $ \i -> do let wanted = value i got <- fmap read $ IO.readFile' $ "output-" ++ show i ++ ".txt" when (wanted /= got) $ error $ "INCORRECT VALUE for " ++ show i addBang :: [Logic] -> IO [Logic] addBang xs = do i <- randomRIO (0, length xs - 1) let (before,now:after) = splitAt i xs now <- f now pure $ before ++ now : after where f (Logic log xs) = do i <- randomRIO (0, length xs) let (before,after) = splitAt i xs pure $ Logic log $ before ++ [Bang] : after f x = pure x randomLogic :: IO [Logic] -- only Logic constructors randomLogic = do rules <- randomRIO (1,100) f rules $ map Input inputRange where f 0 _ = pure [] f i avail = do needs <- randomRIO (0,3) xs <- replicateM needs $ do ns <- randomRIO (0,3) replicateM ns $ randomElem avail let r = Logic i xs (r:) <$> f (i-1) (Output i:avail) shake-0.19.8/src/Test/Rebuild.hs0000644000000000000000000000422707346545000014532 0ustar0000000000000000 module Test.Rebuild(main) where import Development.Shake import Test.Type import Text.Read import Data.List.Extra import Control.Monad import General.GetOpt data Opt = Timestamp String | Pattern Pat 
opts = [Option "" ["timestamp"] (ReqArg (Right . Timestamp) "VALUE") "Value used to detect what has rebuilt when" ,Option "" ["pattern"] (ReqArg (fmap Pattern . readEither) "PATTERN") "Which file rules to use (%>, &?> etc)"] main = testBuildArgs test opts $ \args -> do let timestamp = concat [x | Timestamp x <- args] let p = lastDef PatWildcard [x | Pattern x <- args] want ["a.txt"] pat p "a.txt" $ \out -> do src <- readFile' "b.txt" writeFile' out $ src ++ timestamp pat p "b.txt" $ \out -> do src <- readFile' "c.txt" writeFile' out $ src ++ timestamp test build = forM_ [minBound..maxBound :: Pat] $ \pat -> do build ["clean"] let go arg c b a flags = do writeFileChanged "c.txt" c build $ ["--timestamp=" ++ arg, "--sleep","--no-reports","--pattern=" ++ show pat] ++ flags assertContents "b.txt" b assertContents "a.txt" a -- check rebuild works go "1" "x" "x1" "x11" [] go "2" "x" "x1" "x11" [] go "3" "x" "x1" "x13" ["--rebuild=a.*"] go "4" "x" "x1" "x13" [] go "5" "x" "x5" "x55" ["--rebuild=b.*"] go "6" "x" "x6" "x66" ["--rebuild"] go "7" "x" "x6" "x66" [] go "8" "y" "y8" "y88" [] -- check skip works go "1" "x" "x1" "x11" [] go "2" "y" "y2" "x11" ["--skip=a.*"] go "3" "y" "y2" "y23" [] go "4" "z" "y2" "y23" ["--skip=b.*"] go "5" "z" "y2" "y23" ["--skip=b.*"] go "6" "z" "z6" "z66" [] go "7" "a" "z6" "z66" ["--skip=c.*"] go "8" "a" "z6" "z66" ["--skip=b.*"] go "9" "a" "a9" "z66" ["--skip=a.*"] go "0" "a" "a9" "a90" [] {- -- check skip-forever works -- currently it does not work properly go "1" "x" "x1" "x11" [] go "2" "y" "y2" "x11" ["--skip-forever=a.*"] go "3" "y" "y2" "x11" [] go "4" "z" "z4" "z44" [] -} shake-0.19.8/src/Test/Reschedule.hs0000644000000000000000000000110307346545000015215 0ustar0000000000000000 module Test.Reschedule(main) where import Development.Shake import Test.Type main = testBuild test $ do file <- newResource "log.txt" 1 let log x = withResource file 1 $ liftIO $ appendFile "log.txt" x "*.p0" %> \out -> do log "0" writeFile' out "" "*.p1" %> 
\out -> do reschedule 1 log "1" writeFile' out "" "*.p2" %> \out -> do reschedule 2 log "2" writeFile' out "" test build = do build ["clean"] build ["foo.p1","bar.p1","baz.p0","qux.p2"] assertContents "log.txt" "0211" shake-0.19.8/src/Test/Resources.hs0000644000000000000000000000576107346545000015122 0ustar0000000000000000 module Test.Resources(main) where import Development.Shake import Test.Type import Data.List.Extra import System.FilePath import Control.Exception.Extra import System.Time.Extra import Control.Monad import Data.IORef main = testBuild test $ do -- test I have good Ord and Show do r1 <- newResource "test" 2 r2 <- newResource "special" 67 unless (r1 < r2 || r2 < r1) $ fail "Resources should have a good ordering" unless ("special" `isInfixOf` show r2) $ fail "Resource should contain their name when shown" -- test you are capped to a maximum value do let cap = 2 inside <- liftIO $ newIORef 0 res <- newResource "test" cap phony "cap" $ need ["c_file" ++ show i ++ ".txt" | i <- [1..4]] "c_*.txt" %> \out -> withResource res 1 $ do old <- liftIO $ atomicModifyIORef inside $ \i -> (i+1,i) when (old >= cap) $ fail "Too many resources in use at one time" liftIO $ sleep 0.1 liftIO $ atomicModifyIORef inside $ \i -> (i-1,i) writeFile' out "" -- test things can still run while you are blocked on a resource do done <- liftIO $ newIORef 0 lock <- newResource "lock" 1 phony "schedule" $ need $ map ("s_" ++) $ "lock1":"done":["free" ++ show i | i <- [1..10]] ++ ["lock2"] "s_done" %> \out -> do need ["s_lock1","s_lock2"] done <- liftIO $ readIORef done when (done < 10) $ fail "Not all managed to schedule while waiting" writeFile' out "" "s_lock*" %> \out -> do withResource lock 1 $ liftIO $ sleep 0.5 writeFile' out "" "s_free*" %> \out -> do liftIO $ atomicModifyIORef done $ \i -> (i+1,()) writeFile' out "" -- test that throttle works properly do res <- newThrottle "throttle" 2 0.4 phony "throttle" $ need ["t_file1.1","t_file2.1","t_file3.2","t_file4.1","t_file5.2"] 
"t_*.*" %> \out -> do withResource res (read $ drop1 $ takeExtension out) $ when (takeBaseName out == "t_file3") $ liftIO $ sleep 0.2 writeFile' out "" test build = do build ["-j2","cap","--clean"] build ["-j4","cap","--clean"] build ["-j10","cap","--clean"] build ["-j2","schedule","--clean"] forM_ ["-j1","-j8"] $ \flags -> -- we are sometimes over the window if the machine is "a bit loaded" at some particular time -- therefore we rerun the test three times, and only fail if it fails on all of them retry 3 $ do (s, _) <- duration $ build [flags,"throttle","--no-report","--clean"] -- the 0.1s cap is a guess at an upper bound for how long everything else should take -- and should be raised on slower machines assertBool (s >= 1.4 && s < 1.8) $ "Bad throttling, expected to take 1.4s + computation time (cap of 0.4s), took " ++ show s ++ "s" shake-0.19.8/src/Test/Self.hs0000644000000000000000000000645407346545000014041 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, TypeFamilies #-} module Test.Self(main, cabalBuildDepends) where import Development.Shake import Development.Shake.Classes import Development.Shake.FilePath import Test.Type import Control.Monad.Extra import Data.Char import Data.List.Extra import System.Info.Extra newtype GhcPkg = GhcPkg () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) newtype GhcFlags = GhcFlags () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) type instance RuleResult GhcPkg = [String] type instance RuleResult GhcFlags = [String] -- Doesn't work on CI, seems self-building under `cabal test` is just hard main = testBuild (notCI . defaultTest) $ do let moduleToFile ext xs = replace "." 
"/" xs <.> ext want ["Main" <.> exe] -- fixup to cope with Cabal's generated files let fixPaths x = if x == "Paths_shake.hs" then "Paths.hs" else x ghcPkg <- addOracleHash $ \GhcPkg{} -> do Stdout out <- quietly $ cmd "ghc-pkg list --simple-output" pure $ words out ghcFlags <- addOracleHash $ \GhcFlags{} -> map ("-package=" ++) <$> readFileLines ".pkgs" let ghc args = do trackAllow ["**/package.cache", "**/.ghc.environment.*"] -- since ghc-pkg includes the ghc package, it changes if the version does ghcPkg $ GhcPkg () flags <- ghcFlags $ GhcFlags () cmd "ghc" flags args "Main" <.> exe %> \out -> do src <- readFileLines "Run.deps" let os = map (moduleToFile "o") $ "Run" : src need os ghc $ ["-o",out] ++ os "**/*.deps" %> \out -> do dep <- readFileLines $ out -<.> "dep" let xs = map (moduleToFile "deps") dep need xs ds <- nubOrd . sort . (++) dep <$> concatMapM readFileLines xs writeFileLines out ds "**/*.dep" %> \out -> do src <- readFile' $ shakeRoot "src" fixPaths (out -<.> "hs") let xs = hsImports src xs <- filterM (doesFileExist . (\x -> shakeRoot "src" x) . fixPaths . moduleToFile "hs") xs writeFileLines out xs ["**/*.o","**/*.hi"] &%> \[out,_] -> do deps <- readFileLines $ out -<.> "deps" let hs = shakeRoot "src" fixPaths (out -<.> "hs") need $ hs : map (moduleToFile "hi") deps ghc ["-c",hs,"-i" ++ shakeRoot "src","-main-is","Run.main" ,"-hide-all-packages","-outputdir=." ,"-DPORTABLE"] -- to test one CPP branch ".pkgs" %> \out -> do src <- readFile' $ shakeRoot "shake.cabal" writeFileLines out $ sort $ cabalBuildDepends src --------------------------------------------------------------------- -- GRAB INFORMATION FROM FILES hsImports :: String -> [String] hsImports xs = [ takeWhile (\x -> isAlphaNum x || x `elem` "._") $ dropWhile (not . 
isUpper) x | x <- concatMap (wordsBy (== ';')) $ lines xs, "import " `isPrefixOf` trim x] -- FIXME: Should actually parse the list from the contents of the .cabal file cabalBuildDepends :: String -> [String] cabalBuildDepends _ = packages ++ ["unix" | not isWindows] packages = words $ "base transformers binary unordered-containers hashable heaps time bytestring primitive " ++ "filepath directory process deepseq random utf8-string extra js-dgtable js-jquery js-flot filepattern" shake-0.19.8/src/Test/SelfMake.hs0000644000000000000000000000376107346545000014635 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, TypeFamilies #-} module Test.SelfMake(main) where import Development.Shake import Development.Shake.Classes import Development.Shake.FilePath import Development.Shake.Util import Test.Self(cabalBuildDepends) import Test.Type import Data.List.Extra newtype GhcPkg = GhcPkg () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) newtype GhcFlags = GhcFlags () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) type instance RuleResult GhcPkg = [String] type instance RuleResult GhcFlags = [String] -- Doesn't work on CI, seems self-building under `cabal test` is just hard main = testBuild (notCI . defaultTest) $ do want ["Main" <.> exe] ghcPkg <- addOracleHash $ \GhcPkg{} -> do Stdout out <- quietly $ cmd "ghc-pkg list --simple-output" pure $ words out ghcFlags <- addOracleHash $ \GhcFlags{} -> map ("-package=" ++) <$> readFileLines ".pkgs" let ghc args = do trackAllow ["**/package.cache", "**/.ghc.environment.*"] -- since ghc-pkg includes the ghc package, it changes if the version does ghcPkg $ GhcPkg () flags <- ghcFlags $ GhcFlags () cmd "ghc" flags args "Main" <.> exe %> \out -> do let run = shakeRoot "src/Run.hs" copyFileChanged (shakeRoot "src" "Paths.hs") "Paths_shake.hs" let flags = ["-i" ++ shakeRoot "src","-dep-suffix=.","-main-is","Run.main" ,"-hide-all-packages","-outputdir=." 
                ,"-DPORTABLE","-fwarn-unused-imports","-Werror"] -- to test one CPP branch
        trackAllow ["**/*.o","**/*.hi","Makefile"]
        ghc $ ["-M",run] ++ flags
        need . concatMap (filter (\x -> takeExtension x == ".hs") . snd) . parseMakefile =<< liftIO (readFile "Makefile")
        ghc $ ["-o",out,run,"-j4"] ++ flags

    ".pkgs" %> \out -> do
        src <- readFile' $ shakeRoot </> "shake.cabal"
        writeFileLines out $ sort $ cabalBuildDepends src
shake-0.19.8/src/Test/Tar.hs0000644000000000000000000000053407346545000013667 0ustar0000000000000000
module Test.Tar(main) where

import Development.Shake
import System.FilePath
import Test.Type

main = testBuild defaultTest $ do
    want ["result.tar"]
    "result.tar" %> \out -> do
        contents <- fmap (map (shakeRoot </>)) $ readFileLines $ shakeRoot </> "src/Test/Tar/list.txt"
        need contents
        cmd "tar -cf" [out] contents
shake-0.19.8/src/Test/Tar/0000755000000000000000000000000007346545000013331 5ustar0000000000000000
shake-0.19.8/src/Test/Tar/list.txt0000644000000000000000000000006107346545000015042 0ustar0000000000000000
src/Test/Tar.hs
src/Run.hs
src/Test/Tar/list.txt
shake-0.19.8/src/Test/Targets.hs0000644000000000000000000000317407346545000014555 0ustar0000000000000000
module Test.Targets(main) where

import Development.Shake
import Development.Shake.Internal.Core.Rules (getHelpSuffix)
import Test.Type

main :: IO () -> IO ()
main _sleeper = do
    targets <- getTargets shakeOptions rules
    targets === expected
    helpSuffix <- getHelpSuffix shakeOptions rules
    helpSuffix === ["Don't Panic", "Know where your towel is"]

rules :: Rules ()
rules = do
    withTargetDocs "A phony target" $ phony "phony1" $ pure ()
    "file1" %> \_ -> pure ()
    ["file2", "file3"] |%> \_ -> pure ()
    ["file4", "file5"] &%> \_ -> pure ()
    "file6" %> \_ -> pure ()
    ["file7", "file8"] |%> \_ -> pure ()
    ["file9", "file10"] &%> \_ -> pure ()
    withTargetDocs "Builds something really good" $ phony "phony2" $ pure ()
    withTargetDocs "bad docs" $ do
        withTargetDocs "a great file" $ "file11" %> \_ -> pure ()
        withTargetDocs "awesome files" $
            ["file12", "file13"] &%> \_ -> pure ()
    phony "Foo" $ pure ()
    withoutTargets $ phony "Bar" $ pure ()
    addHelpSuffix "Don't Panic"
    addHelpSuffix "Know where your towel is"

expected :: [(String, Maybe String)]
expected =
    ["phony1" * Just "A phony target"
    ,"file1" * Nothing
    ,"file2" * Nothing
    ,"file3" * Nothing
    ,"file4" * Nothing
    ,"file5" * Nothing
    ,"file6" * Nothing
    ,"file7" * Nothing
    ,"file8" * Nothing
    ,"file9" * Nothing
    ,"file10" * Nothing
    ,"phony2" * Just "Builds something really good"
    ,"file11" * Just "a great file"
    ,"file12" * Just "awesome files"
    ,"file13" * Just "awesome files"
    ,"Foo" * Just "bad docs"
    ]
    where (*) = (,)
shake-0.19.8/src/Test/Thread.hs0000644000000000000000000000415607346545000014354 0ustar0000000000000000
{-# LANGUAGE ScopedTypeVariables, TupleSections #-}

module Test.Thread(main) where

import General.Cleanup
import General.Thread
import Control.Exception.Extra
import Control.Concurrent.Extra
import Data.Either.Extra
import Data.IORef
import Test.Type

main = testSimple $ do
    ref <- newIORef 0
    let finish = atomicModifyIORef ref $ \x -> (x+1, ())
    let finished want = do
            got <- atomicModifyIORef ref (0,)
            want === got

    pauser <- newEmptyMVar
    let pause = takeMVar pauser
    let unpause = putMVar pauser ()

    let isAnswer x act = do
            r <- assertWithin 1 $ try_ act
            mapLeft show r === Right x
    let isException x act = do
            r <- assertWithin 1 $ try_ act
            mapLeft fromException r === Left (Just x)

    putStrLn "## allocateThread, spawned finishes first"
    isAnswer 1 $ withCleanup $ \cleanup -> do
        allocateThread cleanup finish
        sleep 0.1
        pure 1
    finished 1

    putStrLn "## allocateThread, main finishes first"
    isAnswer 1 $ withCleanup $ \cleanup -> do
        allocateThread cleanup $ (unpause >> sleep 100) `finally` finish
        pause
        pure 1
    finished 1

    putStrLn "## allocateThread, spawned throws an exception"
    isException Overflow $ withCleanup $ \cleanup -> do
        allocateThread cleanup $ pause >> throw Overflow
        (unpause >> sleep 100) `finally` finish
    finished 1

    putStrLn "## allocateThread, main throws an exception"
    isException Overflow $ withCleanup $ \cleanup -> do
        allocateThread cleanup $ (unpause >> sleep 100) `finally` finish
        pause
        throw Overflow
        pure 1
    finished 1

    putStrLn "## withThreadsBoth, both succeed"
    isAnswer (2,3) $ withThreadsBoth (pure 2) (pure 3)

    putStrLn "## withThreadsBoth, left fails"
    isException Overflow $ withThreadsBoth
        (pause >> throw Overflow >> pure 1)
        ((unpause >> pure 3) `finally` finish)
    finished 1

    putStrLn "## withThreadsBoth, right fails"
    isException Overflow $ withThreadsBoth
        ((unpause >> pure 3) `finally` finish)
        (pause >> throw Overflow >> pure 1)
    finished 1
shake-0.19.8/src/Test/Tup.hs0000644000000000000000000000300107346545000013701 0ustar0000000000000000
module Test.Tup(main) where

import Development.Shake
import Development.Shake.Config
import Development.Shake.FilePath
import Development.Shake.Util
import Test.Type
import Data.Maybe
import Control.Monad
import System.Info.Extra

-- Running ar on Mac seems to break in CI - not sure why
main = testBuild (unless isMac .
    defaultTest) $ do
    -- Example inspired by http://gittup.org/tup/ex_multiple_directories.html
    usingConfigFile $ shakeRoot </> "src/Test/Tup/root.cfg"
    action $ do
        keys <- getConfigKeys
        need [x -<.> exe | x <- keys, takeExtension x == ".exe"]

    let objects dir key = do
            let f x | takeExtension x == ".c" = dir </> x -<.> "o"
                    | takeExtension x == ".a" = takeBaseName x </> "lib" ++ x
                    | otherwise = error $ "Unknown extension, " ++ x
            x <- fromMaybe (error $ "Missing config key, " ++ key) <$> getConfig key
            pure $ map f $ words x

    (\x -> x -<.> exe == x) ?> \out -> do
        os <- objects "" $ takeBaseName out <.> "exe"
        need os
        cmd "gcc" os "-o" [out]

    "//lib*.a" %> \out -> do
        os <- objects (drop 3 $ takeBaseName out) $ drop 3 $ takeFileName out
        need os
        cmd "ar crs" [out] os

    "//*.o" %> \out -> do
        let src = shakeRoot </> "src/Test/Tup" </> out -<.> "c"
        need [src]
        cmd_ "gcc -c -MMD -MF" [out -<.> "d"] [src] "-o" [out] "-O2 -Wall" ["-I" ++ shakeRoot </> "src/Test/Tup/newmath"]
        neededMakefileDependencies $ out -<.> "d"
shake-0.19.8/src/Test/Tup/0000755000000000000000000000000007346545000013353 5ustar0000000000000000
shake-0.19.8/src/Test/Tup/hello.c0000644000000000000000000000023107346545000014616 0ustar0000000000000000
#include <stdio.h>
#include "square.h"

int main(void) {
    printf("Hi, everybody!\n");
    printf("Five squared is: %i\n", square(5));
    return 0;
}
shake-0.19.8/src/Test/Tup/newmath/0000755000000000000000000000000007346545000015016 5ustar0000000000000000
shake-0.19.8/src/Test/Tup/newmath/root.cfg0000644000000000000000000000002607346545000016460 0ustar0000000000000000
newmath.a = square.c
shake-0.19.8/src/Test/Tup/newmath/square.c0000644000000000000000000000007507346545000016464 0ustar0000000000000000
#include "square.h"

int square(int x) {
    return x * x;
}
shake-0.19.8/src/Test/Tup/newmath/square.h0000644000000000000000000000002307346545000016462 0ustar0000000000000000
int square(int x);
shake-0.19.8/src/Test/Tup/root.cfg0000644000000000000000000000011407346545000015013 0ustar0000000000000000
hello.exe = hello.c newmath.a
include ../../src/Test/Tup/newmath/root.cfg shake-0.19.8/src/Test/Type.hs0000644000000000000000000003230607346545000014064 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} module Test.Type( sleep, sleepFileTime, sleepFileTimeCalibrate, testBuildArgs, testBuild, testSimple, testNone, shakeRoot, defaultTest, hasTracker, notCI, notWindowsCI, notMacCI, copyDirectoryChanged, copyFileChangedIO, assertWithin, assertBool, assertBoolIO, assertException, assertExceptionAfter, assertContents, assertContentsUnordered, assertContentsWords, assertContentsInfix, assertExists, assertMissing, assertTimings, (===), (&?%>), Pat(PatWildcard), pat, BinarySentinel(..), RandomType(..), ) where import Development.Shake import Development.Shake.Classes import Development.Shake.Forward import Development.Shake.Internal.FileName import General.Extra import Development.Shake.Internal.FileInfo import Development.Shake.FilePath import Development.Shake.Internal.Paths import Control.Exception.Extra import Control.Monad.Extra import Data.List.Extra import Text.Read(readMaybe) import Data.Maybe import Data.Either import Data.Typeable import System.Directory.Extra as IO import System.Environment import System.Random import General.GetOpt import System.IO.Extra as IO import System.Time.Extra import System.Info.Extra testBuildArgs :: (([String] -> IO ()) -> IO ()) -- ^ The test driver -> [OptDescr (Either String a)] -- ^ Arguments the test can accept -> ([a] -> Rules ()) -- ^ The Shake script under test -> IO () -- ^ Sleep function, driven by passing @--sleep@ -> IO () testBuildArgs f opts g = shakenEx False opts f (\os args -> if null args then g os else want args >> withoutActions (g os)) testBuild :: (([String] -> IO ()) -> IO ()) -- ^ The test driver -> Rules () -- ^ The Shake script under test -> IO () -- ^ Sleep function, driven by passing @--sleep@ -> IO () testBuild f g = testBuildArgs f [] (const g) testSimple :: IO () -> IO () -> IO 
() testSimple act = testBuild (const act) (pure ()) testNone :: IO () -> IO () testNone _ = pure () shakenEx :: Bool -> [OptDescr (Either String a)] -> (([String] -> IO ()) -> IO ()) -> ([a] -> [String] -> Rules ()) -> IO () -> IO () shakenEx reenter options test rules sleeper = do initDataDirectory name:args <- getArgs putStrLn $ "## BUILD " ++ unwords (name:args) let forward = "--forward" `elem` args args <- pure $ delete "--forward" args let out = "output/" ++ name ++ "/" let change = if not reenter then withCurrentDirectory out else id let clean = do now <- getCurrentDirectory when (takeBaseName now /= name) $ fail $ "Clean went horribly wrong! Dangerous deleting: " ++ show now withCurrentDirectory (now "..") $ do removePathForcibly now createDirectoryRecursive now unless reenter $ createDirectoryRecursive out case args of "test":_ -> do putStrLn $ "## TESTING " ++ name change $ test (\args -> withArgs (name:args) $ shakenEx True options test rules sleeper) putStrLn $ "## FINISHED TESTING " ++ name "clean":args -> do when (args /= []) $ fail "Unexpected additional arguments to 'clean'" change clean "perturb":args -> forever $ do del <- removeFilesRandom out threads <- randomRIO (1,4) putStrLn $ "## TESTING PERTURBATION (" ++ show del ++ " files, " ++ show threads ++ " threads)" shake shakeOptions{shakeFiles=out, shakeThreads=threads, shakeVerbosity=Error} $ rules [] args args -> change $ do t <- tracker opts <- pure shakeOptions{shakeFiles = "."} cwd <- getCurrentDirectory opts <- pure $ if forward then forwardOptions opts{shakeLintInside=[""]} else opts {shakeLint = Just t ,shakeLintInside = [cwd ".." 
".."] ,shakeLintIgnore = [".cabal-sandbox/**",".stack-work/**","../../.stack-work/**"]} withArgs args $ do let optionsBuiltin = optionsEnumDesc [(Clean, "Clean before building.") ,(Sleep, "Pause before executing.") ,(UsePredicate, "Use &?> in preference to &%>")] shakeArgsOptionsWith opts (optionsBuiltin `mergeOptDescr` options) $ \so extra files -> do let (extra1, extra2) = partitionEithers extra when (Clean `elem` extra1) clean when (Sleep `elem` extra1) sleeper so <- pure $ if UsePredicate `notElem` extra1 then so else so{shakeExtra = addShakeExtra UsePredicateYes $ shakeExtra so} if "clean" `elem` files then clean >> pure Nothing else pure $ Just $ (,) so $ do -- if you have passed sleep, suppress the "no actions" warning when (Sleep `elem` extra1) $ action $ pure () rules extra2 files data Flags = Clean -- ^ Clean all the files before starting | Sleep -- ^ Call 'sleepFileTimeCalibrate' before starting | UsePredicate -- ^ Use &?> in preference to &%> deriving (Eq,Show) data UsePredicateYes = UsePredicateYes deriving Typeable (&?%>) :: [FilePattern] -> ([FilePath] -> Action ()) -> Rules () deps &?%> act = do so :: Maybe UsePredicateYes <- getShakeExtraRules if isJust so then (\x -> if x `elem` deps then Just deps else Nothing) &?> act else deps &%> act -- A way to get back to the source files after you get directory changed shakeRoot :: FilePath shakeRoot = "../.." 
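The `UsePredicate` flag above travels through `shakeExtra`: `shakenEx` stores a `UsePredicateYes` value with `addShakeExtra`, and `&?%>` fetches it back by type with `getShakeExtraRules`. A toy model of such a type-indexed bag, using base's `Data.Dynamic` (the `addExtra`/`getExtra` names are hypothetical; Shake's real map is keyed by `TypeRep` rather than being a list):

```haskell
import Data.Dynamic
import Data.Typeable (Typeable)
import Data.Maybe (listToMaybe, mapMaybe)

-- A heterogeneous bag of values, fetched back by their type.
type Extra = [Dynamic]

-- Store a value; later entries of the same type shadow earlier ones.
addExtra :: Typeable a => a -> Extra -> Extra
addExtra x xs = toDyn x : xs

-- Fetch the first value of the requested type, if any.
getExtra :: Typeable a => Extra -> Maybe a
getExtra = listToMaybe . mapMaybe fromDynamic
```

As in Shake, the lookup type drives the result: asking for a type that was never stored gives `Nothing`, which is why `&?%>` can treat the absence of `UsePredicateYes` as "use `&%>`".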
tracker :: IO Lint tracker = do fsatrace <- findExecutable $ "fsatrace" <.> exe -- Tracking on a Mac is pretty unreliable pure $ if not isMac && isJust fsatrace then LintFSATrace else LintBasic -- Tests that don't currently work on CI notCI :: IO () -> IO () notCI act = do b <- lookupEnv "CI" when (isNothing b) act -- Tests that don't currently work on Windows CI notWindowsCI :: IO () -> IO () notWindowsCI = if isWindows then notCI else id -- Tests that don't currently work on Mac CI notMacCI :: IO () -> IO () notMacCI = if isMac then notCI else id hasTracker :: IO Bool hasTracker = do t <- tracker pure $ t == LintFSATrace assertFail :: String -> IO a assertFail msg = error $ "ASSERTION FAILED: " ++ msg assertBool :: Bool -> String -> IO () assertBool b msg = unless b $ assertFail msg assertBoolIO :: IO Bool -> String -> IO () assertBoolIO b msg = do b <- b; assertBool b msg infix 4 === (===) :: (Show a, Eq a) => a -> a -> IO () a === b = assertBool (a == b) $ "failed in ===\nLHS: " ++ show a ++ "\nRHS: " ++ show b assertExists :: FilePath -> IO () assertExists file = do b <- IO.doesFileExist file assertBool b $ "File was expected to exist, but is missing: " ++ file assertMissing :: FilePath -> IO () assertMissing file = do b <- IO.doesFileExist file assertBool (not b) $ "File was expected to be missing, but exists: " ++ file assertWithin :: Seconds -> IO a -> IO a assertWithin n act = do t <- timeout n act case t of Nothing -> assertFail $ "Expected to complete within " ++ show n ++ " seconds, but did not" Just v -> pure v assertContents :: FilePath -> String -> IO () assertContents file want = do got <- IO.readFile' file assertBool (want == got) $ "File contents are wrong: " ++ file ++ "\nWANT: " ++ want ++ "\nGOT: " ++ got assertContentsInfix :: FilePath -> String -> IO () assertContentsInfix file want = do got <- IO.readFile' file assertBool (want `isInfixOf` got) $ "File contents are wrong: " ++ file ++ "\nWANT (infix): " ++ want ++ "\nGOT: " ++ got 
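`assertWithin` above is a thin wrapper over `timeout` from `System.Timeout`. A base-only version of the same idea (taking microseconds directly, rather than the `Seconds` type from time-extra that the real helper uses):

```haskell
import System.Timeout (timeout)

-- Run an action, failing loudly if it does not complete in time.
-- Sketch of Test.Type's assertWithin; `timeout` returns Nothing on expiry.
assertWithinMicros :: Int -> IO a -> IO a
assertWithinMicros us act = do
    r <- timeout us act
    case r of
        Nothing -> error $ "Expected to complete within " ++ show us ++ "us, but did not"
        Just v -> pure v
```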
assertContentsOn :: (String -> String) -> FilePath -> String -> IO () assertContentsOn f file want = do got <- IO.readFile' file assertBool (f want == f got) $ "File contents are wrong: " ++ file ++ "\nWANT: " ++ want ++ "\nGOT: " ++ got ++ "\nWANT (transformed): " ++ f want ++ "\nGOT (transformed): " ++ f got assertContentsWords :: FilePath -> String -> IO () assertContentsWords = assertContentsOn (unwords . words) assertContentsUnordered :: FilePath -> [String] -> IO () assertContentsUnordered file xs = assertContentsOn (unlines . sort . lines) file (unlines xs) assertExceptionAfter :: (String -> String) -> [String] -> IO a -> IO () assertExceptionAfter tweak parts act = do res <- try_ act case res of Left err -> let s = tweak $ show err in forM_ parts $ \p -> assertBool (p `isInfixOf` s) $ "Incorrect exception, missing part:\nGOT: " ++ s ++ "\nWANTED: " ++ p Right _ -> error $ "Expected an exception containing " ++ show parts ++ ", but succeeded" assertException :: [String] -> IO a -> IO () assertException = assertExceptionAfter id assertTimings :: ([String] -> IO ()) -> [(String, Seconds)] -> IO () assertTimings build expect = do build ["--report=report.json","--no-build"] src <- IO.readFile' "report.json" let f ('[':'\"':xs) | (name,_:',':xs) <- break (== '\"') xs , num <- takeWhile (`notElem` ",]") xs , Just num <- readMaybe num = (name, num :: Double) f x = error $ "Failed to parse JSON output in assertTimings, " ++ show x let got = [f x | x <- map drop1 $ lines src, x /= ""] forM_ expect $ \(name, val) -> case lookup name got of Nothing -> assertFail $ "Couldn't find key " ++ show name ++ " in profiling output" Just v -> assertBool (v >= val && v < (val + 1)) $ "Unexpected value, got " ++ show v ++ ", hoping for " ++ show val ++ " (+ 1 sec)" defaultTest :: ([String] -> IO ()) -> IO () defaultTest build = do build ["--abbrev=output=$OUT","-j3","--report"] build ["--no-build","--report=-"] build [] -- | Sleep long enough for the modification time resolution 
to catch up sleepFileTime :: IO () sleepFileTime = sleep 1 sleepFileTimeCalibrate :: FilePath -> IO (IO ()) sleepFileTimeCalibrate file = do createDirectoryRecursive $ takeDirectory file -- with 10 measurements can get a bit slow, see #451 -- if it rounds to a second then 1st will be a fraction, but 2nd will be full second mtimes <- forM [1..2] $ \i -> fmap fst $ duration $ do writeFile file $ show i let time = fmap (fst . fromMaybe (error "File missing during sleepFileTimeCalibrate")) $ getFileInfo False $ fileNameFromString file t1 <- time flip loopM 0 $ \j -> do writeFile file $ show (i,j) t2 <- time pure $ if t1 == t2 then Left $ j+1 else Right () putStrLn $ "Longest file modification time lag was " ++ show (ceiling (maximum' mtimes * 1000)) ++ "ms" pure $ sleep $ min 1 $ maximum' mtimes * 2 removeFilesRandom :: FilePath -> IO Int removeFilesRandom x = do files <- getDirectoryContentsRecursive x n <- randomRIO (0,length files) rs <- replicateM (length files) (randomIO :: IO Double) mapM_ (removeFile . snd) $ sort $ zip rs files pure n getDirectoryContentsRecursive :: FilePath -> IO [FilePath] getDirectoryContentsRecursive dir = do xs <- IO.getDirectoryContents dir (dirs,files) <- partitionM IO.doesDirectoryExist [dir x | x <- xs, not $ "." 
`isPrefixOf` x] rest <- concatMapM getDirectoryContentsRecursive dirs pure $ files++rest copyDirectoryChanged :: FilePath -> FilePath -> IO () copyDirectoryChanged old new = do xs <- getDirectoryContentsRecursive old forM_ xs $ \from -> do let to = new drop (length $ addTrailingPathSeparator old) from createDirectoryRecursive $ takeDirectory to copyFileChangedIO from to copyFileChangedIO :: FilePath -> FilePath -> IO () copyFileChangedIO old new = unlessM (liftIO $ IO.doesFileExist new &&^ IO.fileEq old new) $ copyFile old new -- The operators %> ?> &*> &?> |?> |*> all have an isomorphism data Pat = PatWildcard | PatPredicate | PatOrWildcard | PatAndWildcard | PatAndPredicate deriving (Read, Show, Enum, Bounded) pat :: Pat -> FilePattern -> (FilePath -> Action ()) -> Rules () pat PatWildcard p act = p %> act pat PatPredicate p act = (p ?==) ?> act pat PatOrWildcard p act = [p] |%> act pat PatAndWildcard p act = -- single wildcard shortcircuits, so we use multiple to avoid that -- and thus have to fake writing an extra file [p, p ++ "'"] &%> \[x,x'] -> do act x; writeFile' x' "" pat PatAndPredicate p act = (\x -> if p ?== x then Just [x] else Nothing) &?> \[x] -> act x --------------------------------------------------------------------- -- TEST MATERIAL -- Some errors require multiple modules to replicate (e.g. #506), so put that here newtype BinarySentinel a = BinarySentinel () deriving (Eq,Show,NFData,Typeable,Hashable) instance forall a . 
Typeable a => Binary (BinarySentinel a) where put (BinarySentinel ()) = put $ show (typeRep (Proxy :: Proxy a)) get = do x <- get let want = show (typeRep (Proxy :: Proxy a)) if x == want then pure $ BinarySentinel () else error $ "BinarySentinel failed, got " ++ show x ++ " but wanted " ++ show want newtype RandomType = RandomType (BinarySentinel ()) deriving (Eq,Show,NFData,Typeable,Hashable,Binary) shake-0.19.8/src/Test/Unicode.hs0000644000000000000000000000466607346545000014541 0ustar0000000000000000 module Test.Unicode(main) where import Development.Shake import Development.Shake.FilePath import Test.Type import Data.List.Extra import General.GetOpt import Control.Monad import GHC.IO.Encoding -- | Decode a dull ASCII string to certain unicode points, necessary because -- withArgs (even the UTF8 version) throws an encoding error on the > 256 code points decode :: String -> String decode ('e':'^':xs) = '\xEA' : decode xs -- Latin Small Letter E With Circumflex decode (':':')':xs) = '\x263A' : decode xs -- White Smiling Face decode (x:xs) = x : decode xs decode [] = [] data Arg = Prefix String | Want String opts = [Option "" ["prefix"] (ReqArg (Right . Prefix) "") "" ,Option "" ["want"] (ReqArg (Right . Want) "") ""] main = testBuildArgs test opts $ \xs -> do let pre = lastDef "" [decode x | Prefix x <- xs :: [Arg]] want [decode x | Want x <- xs] pre ++ "dir/*" %> \out -> do let src = takeDirectory (takeDirectory out) takeFileName out copyFile' src out pre ++ ".out" %> \out -> do a <- readFile' $ pre ++ "dir" pre <.> "source" b <- readFile' $ pre <.> "multi1" writeFile' out $ a ++ b ["*.multi1","*.multi2"] &%> \[m1,m2] -> do b <- doesFileExist $ m1 -<.> "exist" writeFile' m1 $ show b writeFile' m2 $ show b test build = do build ["clean"] -- Useful, if the error message starts crashing... 
-- IO.hSetEncoding IO.stdout IO.char8 -- IO.hSetEncoding IO.stderr IO.char8 forM_ ["normal","e^",":)","e^-:)"] $ \pre -> do -- If you aren't on UTF-8 file encoding it goes wrong, see -- https://github.com/ndmitchell/shake/pull/681 enc <- liftIO getFileSystemEncoding if textEncodingName enc /= "UTF-8" then putStrLn "WARNING: filesystem encoding is not UTF-8, skipping unicode test (LANG=C ?)" else do let ext x = decode pre <.> x writeFile (ext "source") "x" build ["--prefix=" ++ pre, "--want=" ++ pre <.> "out", "--sleep"] assertContents (ext "out") $ "x" ++ "False" writeFile (ext "source") "y" build ["--prefix=" ++ pre, "--want=" ++ pre <.> "out", "--sleep"] assertContents (ext "out") $ "y" ++ "False" writeFile (ext "exist") "" build ["--prefix=" ++ pre, "--want=" ++ pre <.> "out"] assertContents (ext "out") $ "y" ++ "True" shake-0.19.8/src/Test/Util.hs0000644000000000000000000000140407346545000014053 0ustar0000000000000000 module Test.Util(main) where import Development.Shake.Util import Test.Type main = testSimple $ do parseMakefile "" === [] parseMakefile "a:b c\ndef : ee" === [("a",["b","c"]),("def",["ee"])] parseMakefile "a: #comment\n#comment : b\nc : d" === [("a",[]),("c",["d"])] parseMakefile "a \\\n\t:b" === [("a",["b"])] parseMakefile "#comment\\ a : b" === [] parseMakefile "a: b c \\\n d e\n\nf:g" === [("a",["b","c","d","e"]),("f",["g"])] parseMakefile "foo/bar: \\\r\n c:/a1 \\\r\n x\r\n" === [("foo/bar",["c:/a1","x"])] parseMakefile "output.o: src/file/with\\ space.cpp" === [("output.o",["src/file/with space.cpp"])] parseMakefile "a: b\\ c" === [("a",["b ","c"])] parseMakefile "a: b\\ c\\ d e" === [("a",["b c d","e"])] shake-0.19.8/src/Test/Verbosity.hs0000644000000000000000000000223607346545000015130 0ustar0000000000000000 module Test.Verbosity(main) where import Development.Shake import Test.Type main = testBuild test $ do "in.txt" %> \out -> do a <- getVerbosity b <- withVerbosity Info getVerbosity writeFile' out $ unwords $ map show [a,b] "out.txt" %> 
\out -> do x <- getVerbosity ys <- withVerbosity Verbose $ do a <- getVerbosity need ["in.txt"] -- make sure the inherited verbosity does not get passed along b <- getVerbosity c <- quietly getVerbosity d <- fmap shakeVerbosity getShakeOptions pure [a,b,c,d] z <- getVerbosity writeFile' out $ unwords $ map show $ [x] ++ ys ++ [z] test build = do build ["out.txt","--clean"] assertContents "in.txt" "Info Info" assertContents "out.txt" "Info Verbose Verbose Error Info Info" build ["out.txt","--clean","--verbose"] assertContents "in.txt" "Verbose Info" assertContents "out.txt" "Verbose Verbose Verbose Error Verbose Verbose" build ["out.txt","--clean","--quiet"] assertContents "in.txt" "Warn Info" assertContents "out.txt" "Warn Verbose Verbose Error Warn Warn" shake-0.19.8/src/Test/Version.hs0000644000000000000000000000422107346545000014563 0ustar0000000000000000{-# LANGUAGE TypeFamilies, GeneralizedNewtypeDeriving, DeriveDataTypeable #-} module Test.Version(main) where import Development.Shake import Development.Shake.Classes import General.GetOpt import Data.List.Extra import Text.Read import Test.Type newtype Opts = Ver Int opts = [Option "" ["ver"] (ReqArg (fmap Ver . 
readEither) "INT") ""] newtype Oracle = Oracle () deriving (Show,Eq,Hashable,Binary,NFData,Typeable) type instance RuleResult Oracle = Int main = testBuildArgs test opts $ \opts -> do want ["foo.txt","ver.txt","oracle.txt"] "foo.txt" %> \file -> liftIO $ appendFile file "x" let ver = headDef 0 [x | Ver x <- opts] versioned ver $ "ver.txt" %> \out -> liftIO $ appendFile out $ show ver versioned ver $ addOracleCache $ \(Oracle ()) -> do liftIO $ appendFile "oracle.in" $ show ver pure $ ver `mod` 2 "oracle.txt" %> \out -> do v <- askOracle $ Oracle () liftIO $ appendFile out $ show v test build = do writeFile "foo.txt" "" v1 <- getHashedShakeVersion ["foo.txt"] writeFile "foo.txt" "y" v2 <- getHashedShakeVersion ["foo.txt"] assertBool (v1 /= v2) "Hashes must not be equal" build ["clean"] build [] assertContents "foo.txt" "x" build ["--rule-version=new"] assertContents "foo.txt" "xx" build ["--rule-version=new"] assertContents "foo.txt" "xx" build ["--rule-version=extra","--silent"] assertContents "foo.txt" "xxx" build ["--rule-version=more","--no-rule-version"] assertContents "foo.txt" "xxx" build ["--rule-version=more"] assertContents "foo.txt" "xxx" build ["--rule-version=final","--silent"] assertContents "foo.txt" "xxxx" build ["clean"] build [] assertContents "ver.txt" "0" assertContents "foo.txt" "x" build ["--ver=0","--silent"] assertContents "ver.txt" "0" build ["--ver=8"] build ["--ver=9","--silent"] build ["--ver=9","--silent"] build ["--ver=3","--silent"] assertContents "ver.txt" "0893" assertContents "oracle.in" "0893" -- when you change version you don't do cutoff assertContents "oracle.txt" "0011" assertContents "foo.txt" "x"
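The test opens by checking that `getHashedShakeVersion` yields different version strings for different file contents, so that editing the named files invalidates old builds. A sketch of that idea with a pure FNV-1a hash (the `hashedVersion` name is hypothetical, and it hashes in-memory strings where Shake hashes the named files):

```haskell
import Data.Bits (xor)
import Data.Char (ord)
import Data.Word (Word32)
import Numeric (showHex)

-- Derive a version string from rule-source contents: any edit to the
-- inputs changes the hash, and hence the version. FNV-1a over the
-- concatenated contents; Word32 arithmetic wraps modulo 2^32.
hashedVersion :: [String] -> String
hashedVersion = flip showHex "" . foldl step (2166136261 :: Word32) . concat
  where step h c = (h `xor` fromIntegral (ord c)) * 16777619
```

The hash is deterministic, so repeated builds over unchanged inputs agree on the version, while any content change (like the `writeFile "foo.txt" "y"` in the test) produces a new one.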