shake-0.16.4/shake.cabal

cabal-version: >= 1.18
build-type:    Simple
name:          shake
version:       0.16.4
license:       BSD3
license-file:  LICENSE
category:      Development, Shake
author:        Neil Mitchell
maintainer:    Neil Mitchell
copyright:     Neil Mitchell 2011-2018
synopsis:      Build system library, like Make, but more accurate dependencies.
description:
    Shake is a Haskell library for writing build systems - designed as a
    replacement for @make@. See "Development.Shake" for an introduction,
    including an example. The homepage contains links to a user manual,
    an academic paper and further information:
    .
    To use Shake the user writes a Haskell program that imports
    "Development.Shake", defines some build rules, and calls the
    'Development.Shake.shakeArgs' function. Thanks to do notation and infix
    operators, a simple Shake build system is not too dissimilar from a
    simple Makefile. However, as build systems get more complex, Shake is
    able to take advantage of the excellent abstraction facilities offered
    by Haskell and easily support much larger projects. The Shake library
    provides all the standard features available in other build systems,
    including automatic parallelism and minimal rebuilds. Shake also
    provides more accurate dependency tracking, including seamless support
    for generated files, and dependencies on system information
    (e.g. compiler version).
homepage:      https://shakebuild.com
bug-reports:   https://github.com/ndmitchell/shake/issues
tested-with:   GHC==8.4.1, GHC==8.2.2, GHC==8.0.2, GHC==7.10.3, GHC==7.8.4, GHC==7.6.3, GHC==7.4.2
extra-doc-files:
    CHANGES.txt
    README.md
    docs/Manual.md
    docs/shake-progress.png
extra-source-files:
    src/Paths.hs
    src/Test/C/constants.c
    src/Test/C/constants.h
    src/Test/C/main.c
    src/Test/Ninja/*.ninja
    src/Test/Ninja/*.output
    src/Test/Ninja/subdir/*.ninja
    src/Test/Progress/*.prog
    src/Test/Tar/list.txt
    src/Test/Tup/hello.c
    src/Test/Tup/newmath/root.cfg
    src/Test/Tup/newmath/square.c
    src/Test/Tup/newmath/square.h
    src/Test/Tup/root.cfg
data-files:
    docs/manual/build.bat
    docs/manual/Build.hs
    docs/manual/build.sh
    docs/manual/constants.c
    docs/manual/constants.h
    docs/manual/main.c
    html/profile.html
    html/progress.html
    html/shake.js
    html/viz.js

source-repository head
    type:     git
    location: https://github.com/ndmitchell/shake.git

flag portable
    default: False
    manual: True
    description: Obtain FileTime using portable functions

library
    default-language: Haskell2010
    hs-source-dirs: src
    build-depends:
        base >= 4.5,
        binary,
        bytestring,
        deepseq >= 1.1,
        directory,
        extra >= 1.6.1,
        filepath,
        hashable >= 1.1.2.3,
        js-flot,
        js-jquery,
        primitive,
        process >= 1.1,
        random,
        time,
        transformers >= 0.2,
        unordered-containers >= 0.2.7,
        utf8-string >= 0.3
    if flag(portable)
        cpp-options: -DPORTABLE
        if impl(ghc < 7.6)
            build-depends: old-time
    else
        if !os(windows)
            build-depends: unix >= 2.5.1
    if !os(windows)
        build-depends: unix
    if impl(ghc < 8.0)
        build-depends: semigroups >= 0.18

    exposed-modules:
        Development.Shake
        Development.Shake.Classes
        Development.Shake.Command
        Development.Shake.Config
        Development.Shake.FilePath
        Development.Shake.Forward
        Development.Shake.Rule
        Development.Shake.Util

    other-modules:
        Development.Ninja.Env
        Development.Ninja.Lexer
        Development.Ninja.Parse
        Development.Ninja.Type
        Development.Shake.Internal.Args
        Development.Shake.Internal.CmdOption
        Development.Shake.Internal.Core.Action
        Development.Shake.Internal.Core.Database
        Development.Shake.Internal.Core.Monad
        Development.Shake.Internal.Core.Pool
        Development.Shake.Internal.Core.Rendezvous
        Development.Shake.Internal.Core.Rules
        Development.Shake.Internal.Core.Run
        Development.Shake.Internal.Core.Storage
        Development.Shake.Internal.Core.Types
        Development.Shake.Internal.Demo
        Development.Shake.Internal.Derived
        Development.Shake.Internal.Errors
        Development.Shake.Internal.FileInfo
        Development.Shake.Internal.FileName
        Development.Shake.Internal.FilePattern
        Development.Shake.Internal.Options
        Development.Shake.Internal.Paths
        Development.Shake.Internal.Profile
        Development.Shake.Internal.Progress
        Development.Shake.Internal.Resource
        Development.Shake.Internal.Rules.Directory
        Development.Shake.Internal.Rules.File
        Development.Shake.Internal.Rules.Files
        Development.Shake.Internal.Rules.Oracle
        Development.Shake.Internal.Rules.OrderOnly
        Development.Shake.Internal.Rules.Rerun
        Development.Shake.Internal.Shake
        Development.Shake.Internal.Value
        General.Bag
        General.Bilist
        General.Binary
        General.Chunks
        General.Cleanup
        General.Concurrent
        General.Extra
        General.FileLock
        General.GetOpt
        General.Ids
        General.Intern
        General.ListBuilder
        General.Makefile
        General.Process
        General.Template
        General.Timing
        Paths_shake

executable shake
    default-language: Haskell2010
    hs-source-dirs: src
    ghc-options: -main-is Run.main
    main-is: Run.hs
    ghc-options: -rtsopts
    -- GHC bug 7646 means -threaded causes errors
    if impl(ghc >= 7.8)
        ghc-options: -threaded "-with-rtsopts=-I0 -qg -qb"
    build-depends:
        base == 4.*,
        binary,
        bytestring,
        deepseq >= 1.1,
        directory,
        extra >= 1.6.1,
        filepath,
        hashable >= 1.1.2.3,
        js-flot,
        js-jquery,
        primitive,
        process >= 1.1,
        random,
        time,
        transformers >= 0.2,
        unordered-containers >= 0.2.7,
        utf8-string >= 0.3
    if flag(portable)
        cpp-options: -DPORTABLE
        if impl(ghc < 7.6)
            build-depends: old-time
    else
        if !os(windows)
            build-depends: unix >= 2.5.1
    if !os(windows)
        build-depends: unix
    if impl(ghc < 8.0)
        build-depends: semigroups >= 0.18
    other-modules:
        Development.Ninja.All
        Development.Ninja.Env
        Development.Ninja.Lexer
        Development.Ninja.Parse
        Development.Ninja.Type
        Development.Shake
        Development.Shake.Classes
        Development.Shake.Command
        Development.Shake.FilePath
        Development.Shake.Internal.Args
        Development.Shake.Internal.CmdOption
        Development.Shake.Internal.Core.Action
        Development.Shake.Internal.Core.Database
        Development.Shake.Internal.Core.Monad
        Development.Shake.Internal.Core.Pool
        Development.Shake.Internal.Core.Rendezvous
        Development.Shake.Internal.Core.Rules
        Development.Shake.Internal.Core.Run
        Development.Shake.Internal.Core.Storage
        Development.Shake.Internal.Core.Types
        Development.Shake.Internal.Demo
        Development.Shake.Internal.Derived
        Development.Shake.Internal.Errors
        Development.Shake.Internal.FileInfo
        Development.Shake.Internal.FileName
        Development.Shake.Internal.FilePattern
        Development.Shake.Internal.Options
        Development.Shake.Internal.Paths
        Development.Shake.Internal.Profile
        Development.Shake.Internal.Progress
        Development.Shake.Internal.Resource
        Development.Shake.Internal.Rules.Directory
        Development.Shake.Internal.Rules.File
        Development.Shake.Internal.Rules.Files
        Development.Shake.Internal.Rules.Oracle
        Development.Shake.Internal.Rules.OrderOnly
        Development.Shake.Internal.Rules.Rerun
        Development.Shake.Internal.Shake
        Development.Shake.Internal.Value
        General.Bag
        General.Bilist
        General.Binary
        General.Chunks
        General.Cleanup
        General.Concurrent
        General.Extra
        General.FileLock
        General.GetOpt
        General.Ids
        General.Intern
        General.ListBuilder
        General.Makefile
        General.Process
        General.Template
        General.Timing
        Paths_shake

test-suite shake-test
    default-language: Haskell2010
    type: exitcode-stdio-1.0
    main-is: Test.hs
    hs-source-dirs: src
    ghc-options: -main-is Test.main -rtsopts
    if impl(ghc >= 7.6)
        -- space leak introduced by -O1 in 7.4, see #445
        ghc-options: -with-rtsopts=-K1K
    if impl(ghc >= 7.8)
        -- GHC bug 7646 (fixed in 7.8) means -threaded causes errors
        ghc-options: -threaded
    build-depends:
        base == 4.*,
        binary,
        bytestring,
        deepseq >= 1.1,
        directory,
        extra >= 1.6.1,
        filepath,
        hashable >= 1.1.2.3,
        js-flot,
        js-jquery,
        primitive,
        process >= 1.1,
        QuickCheck >= 2.0,
        random,
        time,
        transformers >= 0.2,
        unordered-containers >= 0.2.7,
        utf8-string >= 0.3
    if flag(portable)
        cpp-options: -DPORTABLE
        if impl(ghc < 7.6)
            build-depends: old-time
    else
        if !os(windows)
            build-depends: unix >= 2.5.1
    if !os(windows)
        build-depends: unix
    if impl(ghc < 8.0)
        build-depends: semigroups >= 0.18

    other-modules:
        Development.Ninja.All
        Development.Ninja.Env
        Development.Ninja.Lexer
        Development.Ninja.Parse
        Development.Ninja.Type
        Development.Shake
        Development.Shake.Classes
        Development.Shake.Command
        Development.Shake.Config
        Development.Shake.FilePath
        Development.Shake.Forward
        Development.Shake.Internal.Args
        Development.Shake.Internal.CmdOption
        Development.Shake.Internal.Core.Action
        Development.Shake.Internal.Core.Database
        Development.Shake.Internal.Core.Monad
        Development.Shake.Internal.Core.Pool
        Development.Shake.Internal.Core.Rendezvous
        Development.Shake.Internal.Core.Rules
        Development.Shake.Internal.Core.Run
        Development.Shake.Internal.Core.Storage
        Development.Shake.Internal.Core.Types
        Development.Shake.Internal.Demo
        Development.Shake.Internal.Derived
        Development.Shake.Internal.Errors
        Development.Shake.Internal.FileInfo
        Development.Shake.Internal.FileName
        Development.Shake.Internal.FilePattern
        Development.Shake.Internal.Options
        Development.Shake.Internal.Paths
        Development.Shake.Internal.Profile
        Development.Shake.Internal.Progress
        Development.Shake.Internal.Resource
        Development.Shake.Internal.Rules.Directory
        Development.Shake.Internal.Rules.File
        Development.Shake.Internal.Rules.Files
        Development.Shake.Internal.Rules.Oracle
        Development.Shake.Internal.Rules.OrderOnly
        Development.Shake.Internal.Rules.Rerun
        Development.Shake.Internal.Shake
        Development.Shake.Internal.Value
        Development.Shake.Rule
        Development.Shake.Util
        General.Bag
        General.Bilist
        General.Binary
        General.Chunks
        General.Cleanup
        General.Concurrent
        General.Extra
        General.FileLock
        General.GetOpt
        General.Ids
        General.Intern
        General.ListBuilder
        General.Makefile
        General.Process
        General.Template
        General.Timing
        Paths_shake
        Run
        Test.Basic
        Test.Batch
        Test.Benchmark
        Test.C
        Test.Cache
        Test.Command
        Test.Config
        Test.Digest
        Test.Directory
        Test.Docs
        Test.Errors
        Test.Existence
        Test.FileLock
        Test.FilePath
        Test.FilePattern
        Test.Files
        Test.Forward
        Test.Journal
        Test.Lint
        Test.Live
        Test.Manual
        Test.Match
        Test.Monad
        Test.Ninja
        Test.Oracle
        Test.OrderOnly
        Test.Parallel
        Test.Pool
        Test.Progress
        Test.Random
        Test.Rebuild
        Test.Resources
        Test.Self
        Test.Tar
        Test.Tup
        Test.Type
        Test.Unicode
        Test.Util
        Test.Verbosity
        Test.Version

shake-0.16.4/Setup.hs

import Distribution.Simple
main = defaultMain

shake-0.16.4/README.md

# Shake [![Hackage version](https://img.shields.io/hackage/v/shake.svg?label=Hackage)](https://hackage.haskell.org/package/shake) [![Stackage version](https://www.stackage.org/package/shake/badge/lts?label=Stackage)](https://www.stackage.org/package/shake) [![Linux Build Status](https://img.shields.io/travis/ndmitchell/shake.svg?label=Linux%20build)](https://travis-ci.org/ndmitchell/shake) [![Windows Build Status](https://img.shields.io/appveyor/ci/ndmitchell/shake.svg?label=Windows%20build)](https://ci.appveyor.com/project/ndmitchell/shake)

Shake is a tool for writing build systems - an alternative to make, Scons, Ant etc.

Shake has been used commercially for over five years, running thousands of builds per day. The website for Shake users is at [shakebuild.com](https://shakebuild.com).

#### Documentation

* **Why use Shake?** Shake lets you write large robust build systems, which deal properly with generated source files and run quickly. If you are writing a custom build system of any moderate size (more than a few rules) you should use Shake. The advantages over other build systems are detailed in the document [Why choose Shake?](https://shakebuild.com/why).
* **How do I use Shake?** Shake is a Haskell library that you use to define your rules. The [Shake manual](https://shakebuild.com/manual) provides a walk-through of a small but realistic example, assuming no Haskell knowledge.
* [Generated documentation](https://hackage.haskell.org/packages/archive/shake/latest/doc/html/Development-Shake.html) for all functions, including lots of examples.
* [Blog posts](https://neilmitchell.blogspot.co.uk/search/label/shake) detailing ongoing development work.
* [Academic paper](https://ndmitchell.com/downloads/paper-shake_before_building-10_sep_2012.pdf) on the underlying principles behind Shake.

#### Other links

* [Download the Haskell package](https://hackage.haskell.org/package/shake) from Hackage and install it using Cabal.
* [Mailing list](https://groups.google.com/forum/?fromgroups#!forum/shake-build-system) for any questions/bugs/thoughts on Shake. If you need more information and aren't sure where to start, use the mailing list.
* [Questions](https://stackoverflow.com/questions/tagged/shake-build-system) can be asked on StackOverflow with the tag `shake-build-system`.

Is your company using Shake? Write something public (even just a [tweet to `@ndm_haskell`](https://twitter.com/ndm_haskell)) and I'll include a link [on the website](https://shakebuild.com/#who-uses-shake).

shake-0.16.4/LICENSE

Copyright Neil Mitchell 2011-2018.
All rights reserved.

Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:

    * Redistributions of source code must retain the above copyright
      notice, this list of conditions and the following disclaimer.
    * Redistributions in binary form must reproduce the above copyright
      notice, this list of conditions and the following disclaimer in the
      documentation and/or other materials provided with the distribution.

    * Neither the name of Neil Mitchell nor the names of other contributors
      may be used to endorse or promote products derived from this software
      without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.
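The README and package description above describe writing a build system as a Haskell program that imports "Development.Shake", defines rules, and calls `shakeArgs`. As a minimal sketch of that style (this file is not shipped in the package; the `main.c` source file and the `gcc` compiler are assumptions for illustration):

```haskell
-- A hypothetical minimal Shakefile: compile main.c to _build/main.o.
import Development.Shake
import Development.Shake.FilePath

main :: IO ()
main = shakeArgs shakeOptions{shakeFiles = "_build"} $ do
    -- Ask for the final output; Shake works out what must be built.
    want ["_build/main.o"]

    -- Rule: any .o under _build is compiled from the matching .c
    -- in the source directory.
    "_build/*.o" %> \out -> do
        let src = dropDirectory1 out -<.> "c"
        need [src]                       -- track the source as a dependency
        cmd_ "gcc -c" [src] "-o" [out]   -- assumes gcc is on $PATH
```

Compiled with GHC and run, such a program gets the standard Shake command line from `shakeArgs`, so flags like `-j` (parallelism) and `--report` mentioned in the changelog below apply to it directly.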
shake-0.16.4/CHANGES.txt0000644000000000000000000006632313261223302012700 0ustar0000000000000000Changelog for Shake 0.16.4, released 2018-04-04 #185, add addOracleCache which doesn't always rerun #576, remove incorrect Cabal description #575, print --help message at Quiet verbosity or above 0.16.3, released 2018-02-23 Allow reading data files from next to the executable #566, require extra-1.6.1 #565, deal with multi-file rules that don't create their contents Improve the documentation of getDirectoryContents 0.16.2, released 2018-02-13 Set the correct lower bound on unordered-containers of 0.2.7 0.16.1, released 2018-02-11 Support process-1.6.3.0 #290, add withTempFileInside and withTempDirInside #549, Semigroup-Monoid proposal support Hadrian-462, fail gracefully if on-disk formats change #469, only print timing information at normal verbosity or above #340, add runAfter #372, deprecate askOracleWith #520, only record each dependency once #548, rebuild the Ninja files, if they change, restart #547, make Ninja --lint checking see through phonys #546, support Ninja line continuations in more places #470, make --verbose and --quiet help text clearer #534, make writeFile' and writeFileChanged create the directory #543, add resultHasChanged to track results which have changed Add getShakeExtraRules, like getShakeOptionsRules for shakeExtra #535, add shakeArgsOptionsWith to manipulate ShakeOptions #538, make sure processes can be killed properly #502, add needHasChanged to track what in a need has changed Weaken the typeclasses required for apply/apply1 #502, add batch function to batch commands with high startup Avoid parallel duplicating local state such as discounting Prefer to resume tasks rather than start new ones 0.16, released 2017-09-20 #536, make --skip work for oracles Ensure shakeOutput is used more consistently #49, add shakeColor and --color flags #490, recommend -threaded as standard #517, ignore ./ in FilePattern Require extra-1.5.3 #499, add a 
filePattern function, like ?== but with the matches #474, never spawn user actions unmasked Allow user arguments to replace builtin arguments #522, make copyFile create directories if necessary #516, add an example for withTempDir #514, expose more about cmd arguments #523, #524, make sure phony doesn't run its dependencies first #515, add cmd_ function #506, allow duplicate type names in different modules #503, require shakeExtra to obey the sensible invariants #503, add getShakeExtra/addShakeExtra #492, fix the single letter flag documentation Expose 'Process' from Development.Shake #495, remove dangling link from LICENSE #436, remove Assume, switch to Rebuild #419, remove --assume-old and --assume-new, which never worked Remove support for running Makefile scripts Add getShakeOptionsRules, to get ShakeOptions in Rules #479, improve the robustness of the Pool tests #481, document how to raise errors in Action Delete the deprecated system* functions #427, check stored value after checking dependencies Significant changes to defining custom rules Delete the deprecated defaultRule IMPORTANT: Incompatible on disk format change #428, don't persist errors to the database 0.15.11, released 2017-01-18 #488, make sure parallel tracks dependencies #513, permit process-1.4.3.0 and above 0.15.10, released 2016-06-14 #465, fix phony names which clash with directories 0.15.9, released 2016-06-12 Documentation tweaks Optimise the thread pool Fix progress output through wget 0.15.8, released 2016-06-01 Extra base bounds to rule out building on GHC 7.4 0.15.7, released 2016-06-01 Require extra-1.4.8 #457, better error message when calling need on a directory #456, use displayException for inner exceptions Improve the documentation for priority/alternative #448, fix FSATrace on Mac 10.11 #444, optimise copyFileChanged #58, change writeFile functions to MonadIO 0.15.6, released 2016-04-19 Require the primitive package #431, make it an error to need a directory #393, add 
orderOnlyAction, a general version of orderOnly #408, explain that alwaysRerun has no immediate effect #410, add --digest-not flag to select ChangeModtime #416, export ShakeValue from the main module only #352, improve the grammar in the profiling document #399, explain what in getDirectoryFiles is tracked #108, add getDirectoryFilesIO, an untracked version #401, document getDirectoryFiles exception behaviour #401, fix getDirectoryFiles error on pattern directory missing #400, detect indirect recursion in build rules #402, give better errors on recursive rules #369, clarify the docs for putQuiet/putLoud #405, make sure phony targets with \ work on Windows Require extra-1.4.3 #398, only require old-time with GHC 7.4 #239, make Rules an instance of MonadIO #384, add shakeExtra field to ShakeOptions for user data #374, make --debug=file write out newlines #308, make sure Shell and AutoDeps work together #365, fix unicode output with putNormal #269, add parallel execution via parallel/forP/par #310, preserve quoting around cmd #333, optimise the random pool, up to 20% faster #335, add deriving Read instances for more types Add Development.Shake.Forward for forward-defined systems #336, export :-> #337, fix the docs for &%> #297, make run the query profile viewer #305, make profiling work with newer Chrome #327, Add AutoDeps feature #325, add FileStdin to specify the stdin comes from a file Make an empty Stdin different from no Stdin (inherit) in cmd #320, add ** as a relative-only alternative to // in patterns #283, change the semantics of removeFiles, no implicit deletes #283, speed up getDirectoryFiles on patterns involving literals Work with stack test Add RemEnv cmd option Fix a small space leak on Linux Fix a space leak in profile summary information Properly handle *//* in file pattern (at least one directory) #303, allow ninja rule names to have dots in them Properly handle /// in file pattern (a wildcard then /) Make sure Windows separators work in |%> 
FilePattern rules #296, fix multifile rules when using --digest-and-input #286, document the relationship between getEnv and lookupEnv #285, improve the documentation for phony/alwaysRerun #284, fix the documentation for normaliseEx #293, add getProgress #270, support MonadFix for Rules, allows recursive cache #295, discount time spent waiting for a cache #294, give better error messages if you hit GHC bug #10793 #4, add a lock file to stop corrupting databases 0.15.5, released 2015-08-04 #279, make usingConfigFile do a need on the config file Fix a bug where predicted progress could sometimes be ?? #264, make the the suite more less non-deterministic 0.15.4, released 2015-06-20 Undo a locally modified file 0.15.3, released 2015-06-20 #254, in staunch mode, print out all exceptions Require extra-1.3 #259, discount time waiting for a resource in profiles #252, have the docs test configure not in dist 0.15.2, released 2015-05-22 #248, add another example of using cmd #245, initial support for fsatrace lint checking Reexport -<.> from filepath where available Hoogle #106, trigger on filepath version, not GHC version Add AddEnv and AddPath command options #243, close fds in child processes when spawning commands Make Ninja variable handling more accurate 0.15.1, released 2015-04-25 If you have Shakefile.hs, pass it all arguments without interp Add shakeArgsPrune and shakeArgsPruneWith #228, allow running cmd async by collecting the ProcessHandle Make getShakeOptions/processorCount of 0 return the used value #212, document how to get a full terminal with cmd #225, warn if there are no want/action statements #232, don't ignore phony order-only dependencies in Ninja #226, add escaping for GraphViz labels #227, add StdinBS for passing a bytestring as Stdin Make cmd Timeout call terminateProcess as well 0.15, released 2015-03-26 #203, make shakeFiles a directory rather than a file prefix #220, add getHashedShakeVersion helper #220, add shakeVersionIgnore to ignore version 
numbers #219, run Shakefile.hs from the shake binary #218, fix issues with incorrect unchanging with no digests #218, fix issue with ChangeModtimeAndDigest on unchanging files #216, work around GHC 7.10 RC3 bug 10176 #213, add phonys, a predicate phony rule Add CmdTime and CmdLine results to cmd/command Fix parseMakefile for words with multiple escapes in them #205, add WithStdout, like WithStderr #27, add support for capturing Stdout/Stderr with bytestrings Add FileStdout/FileStderr to write a stream direct to a file #211, add Stdouterr to capture both Stdout and Stderr streams Require extra-1.1 (to use nubOrd) Generalise cmd to work with Maybe [String] Add unit for use with cmd IMPORTANT: Incompatible on disk format change #209, improve orderOnly dependencies 0.14.3, released 2015-01-07 Support for the filepath shipped with GHC 7.10 Add Timeout option to command 0.14.2, released 2014-11-27 #198, add operator to join FilePatterns #198, fix the <.> and other extension methods to work with // 0.14.1, released 2014-11-21 #196, change the links to point at http://www.shakebuild.com/ Improve the error messages when &%> or &?> go wrong Normalise file patterns used by &?> 0.14, released 2014-11-19 Make FilePattern on Windows cope with all path separators Rename normalise to normaliseEx Expose native System.FilePath in Development.Shake.FilePath #193, eliminate the custom version of #193, make need/orderOnly call normalise on their arguments #177, use the $PATH variable to resolve cmd #182, use js-jquery and js-flot #195, add getConfigKeys #194, replace *> with %> as the operator for rules #188, improve the docs on FilePattern #187, removeFiles on a missing directory should do nothing #166, add withTempFile and withTempDir in the Action monad Ensure Ninja response files are always deleted Fix incorrect units when showing profile to the console Require the extra library #178, make --help show the current executable name 0.13.4, released 2014-09-19 #171, fix the --demo mode 
on Linux 0.13.3, released 2014-09-16 Ensure you wait until the progress thread cleans up Add --demo mode Add --progress=record and --progress=replay Add -j to run with one thread per processor Fix progress reporting to work even on the first run Apply abbreviations to staunch failure messages Fix failure progress messages during staunch mode #165, make |*> matching with simple files much faster Add toStandard to the FilePath module #130, support msvc_deps_prefix in Ninja #157, allow variable substitution in include/subninja statements #147, add the version in the profile reports #147, add --numeric-version flag #28, switch to continuation passing style, uses far less threads #156, add readConfigFileWithEnv for configs with an initial env #129, fix processor count detection on Mac with -threaded #155, add --live and shakeLiveFiles #153, handle escaped spaces in parseMakefile #152, fix typo in the manual Add copyFileChanged Fix the leaf column in profile reports Add Maybe instances for cmd arguments Add a --profile flag (alias for --report) Switch to an underlying continuation monad 0.13.2, released 2014-06-20 #95, ensure progress never gets corrupted #124, add a profile report demo #128, allow long Ninja command lines Fix --report=- for builds with no commands in them 0.13.1, released 2014-05-28 Remove all package upper bounds #126, Ninja compatibility if Ninja fails to create a file #123, generate Chrome compatible traces 0.13, released 2014-05-27 #122, make --report=- write a report to stdout Improve the profile report summary #122, turn shakeReport into a list of files, instead of a Maybe #60, improve how command lines are printed out #113, print info about copyFile' and removeFilesAfter at -V Replace **> with |*> , ?>> with &?> and *>> with &*> IMPORTANT: Incompatible on disk format change #83, support digest rules for files #83, add shakeChange parameter and --digest* args #83, add equalValue function to Rule typeclass Deprecate defaultRule Make literal *> 
matches take precedence over wildcard matches #120, add a priority function 0.12, released 2014-05-09 #62, move to a ReaderT/IORef for the Action monad Add DEPRECATED pragmas on system' calls Delete Development.Shake.Sys, use command or cmd instead Add a 'console' pool to Ninja Avoid using System.Cmd (deprecated in GHC HEAD) #41, use higher precision file times on POSIX #117, use higher precision times for Unicode files on Windows #118, add support for Ninja -t compdb #119, more test fixes for Linux GHC 7.8 0.11.7, released 2014-05-05 #119, test fixes for Linux GHC 7.8 0.11.6, released 2014-05-04 #114, run build.sh in the test suite with sh #115, fix the test suite on Linux GHC 7.8 #116, fix for GHC 7.10 0.11.5, released 2014-04-25 Include the sources for flot and jQuery in the dist tarball Note that .. in FilePattern values is unlikely to be correct #109, make removeFiles delete directories that match #84, make removeFiles not delete empty unmatched directories #111, fixes to the user manual #110, don't give a warning if -threaded is turned off #103, don't suggest -qg and -qb, they fail without -threaded #102, fix up when the LHS starts with . 
0.11.4, released 2014-03-20 Work with QuickCheck-2.7 (which defines ===) #100, handle GraphViz missing gracefully Fix up the profiling report generation #99, add getEnvWithDefault 0.11.3, released 2014-02-27 #97, fix a serialisation bug when > 254 arguments to need 0.11.2, released 2014-02-18 #96, fix a bug in addPath that caused $PATH to be added twice 0.11.1, released 2014-02-06 #94, GHC 7.8 support Add a Config module #89, support :: as a build rule separator 0.11, released 2014-01-03 Add alternatives to allow overlapping rules Make storedValue take a ShakeOptions structure Generalise the newCache function Improve the performance of the Ninja parser Make the database more compact #84, ensure you normalise removeFile patterns first #82, make -j0 guess at the number of processors #81, add --lint-tracker to use tracker.exe Add trackRead, trackWrite Add trackUse, trackChange, trackAllow #85, move rule creation functions into Development.Shake.Rule Mark Development.Shake.Sys as DEPRECATED with a pragma Change shakeLint to be of type Maybe Lint, instead of Bool #50, add shakeArgsAccumulate 0.10.10, released 2013-11-16 Improve Ninja --lint checking 0.10.9, released 2013-11-15 #76, add Ninja specific lint checking #75, add orderOnly dependencies #76, add needed, to assert a need doesn't rebuild #78, don't use cabal in the test suite #77, add shakeRunCommands and --skip-commands #67, add withVerbosity #51, add getShakeOptions Lots of Haddock improvements Deprecate system', use cmd or command instead #53, add addPath and addEnv to modify environment variables Make all search path things return the native separators #73, if Haskell can't write a unicode filename, skip the test Print out the entire database in diagnostics mode Rewrite database writes, old code could break metadata on error #25, optimise building with Ninja Fix many bugs in FilePath.normalise Require QuickCheck-2.0 or higher Change how the makefile parser treats \ characters Add a --no-build flag #74, allow 
cmd to be used with result IO Add a Util module, with Makefile parsing 0.10.8, released 2013-10-15 Allow unix-1.5.1 again Require Cabal 1.10 or above Convert to the cabal test compatible test suite 0.10.7, released 2013-09-26 Support time-1.2 #71, fix a bug when requesting all three outputs from command #64, add progressProgram and use it in progressSimple Remove isRunning, kill the progress thread on completion #47, improve the performance of ?==, especially on "//*" #68, improve the docs for addOracle #55, ensure if you need phony targets you rebuild every time #52, ensure all command output is printed #20, document preferred compilation flags #20, speed up the Shake executable with GHC RTS flags #39, print out more of the version string on database change #41, require unix-2.6 or above #48, make it clear filetimes are hashes #43, improve the error messages on lint failures #45, avoid use of nub #45, avoid use of intersect which is O(n^2) #26, add newThrottle to create throttling resources #26, add unsafeExtraThread to run more than shakeThreads #32, add a check that withResources does not call need #614, support unicode filenames Require unordered-containers-0.2.1 or above Reduce stack usage when lint checking #24, create the output directories for the multiple file rule Improvements to match Ninja behaviour 0.10.6, released 2013-06-28 Include command/cmd in the default module #16, change the scoping in subninja #15, fix up the Ninja default targets Fix up --version output 0.10.5, released 2013-06-14 Improve progress prediction for the first build Fix removeFiles when there is a directory argument and a pattern Delete shakeDeterministic, shakeThreads=1 has the same effect 0.10.4, released 2013-06-13 Fix writeFileChanged for line endings on Windows Support arguments to --progress Set the number of capabilities based on -j flags Add shakeTimings, to time the separate stages Add AssumeSkip, mostly for benchmarking Normalise file arguments before calling want Expose 
ShakeException Add isFailure to ShakeProgress, and display in progressDisplay Add withResources Add -<.> alias for replaceExtension Add selectors on Stdout/Stdin/Exit Add CmdResult instance for ExitCode 0.10.3, released 2013-05-12 Upgrade to Flot-0.8 Small documentation markup fixes 0.10.2, released 2013-03-30 Require process-1.1 or above Improve progress guesses (use exponential decay of work rate) Improve profiling report error messages Add a Command module, with alternatives to system' etc. #593, don't assume doesDirectoryExist = not . doesFileExist #594, swap how traced messages are printed out Add Chatty verbosity, for what was Loud Make repeated --verbose keep increasing the verbosity If shakeThreads==1 then always use a deterministic pool Add shakeLineBuffering, defaulting to True Improve the performance of profiling Documentation and corrections improvements Eliminate StepKey from the profiling output Add a --no-time flag Makefile improvements Avoid use of ghc-options in .cabal file 0.10.1, released 2013-03-18 Allow the shake executable to build 0.10, released 2013-03-17 Add phony, for writing phony actions Add removeFilesAfter Switch to the new profiling code Add actionOnException and actionFinally Add shakeArgsWith Rename shakeWithArgs to shakeArgs and change the clean argument Remove the -f alias for --flush Add a shake executable which runs Makefiles Rename shake to shake-test Change how progress messages are written Do not write a beep in progressDisplay, avoids a beep Add exe function to FilePath 0.9.1, released 2013-02-28 GHC head (7.7) compatibility by removing the Rules fundep 0.9, released 2013-02-16 Make the tests work on Linux Fix report generation on machines whose text format is not ASCII Make Directory functions work the same on Linux/Mac Change shakeVersion to be a String Stop alwaysRerun causing lint failures Make shakeLint check that the current directory does not change Add shakeOutput setting Add removeFiles function Add Sys module for 
writing concise system calls Add getEnv function Add shakeWithArgs and shakeOptDescrs Add newCache to cache the parsed contents of files Add newResourceIO and move newResource into the Rules monad Add shakeStorageLog, to help diagnosing obscure database errors 0.8, released 2013-01-31 Improve the Oracle documentation Allow getDirectoryFiles to operate recursively 0.7, released 2013-01-29 Change getDirectoryFiles to take a set of file patterns Add doesDirectoryExist function Add shakeAbbreviations feature Add a new Progress module for sensible progress messages Spawn shakeProgress on a separate thread, a safer default Improve the runtime error messages Add a quietly function, to hide traced commands Print main status messages when running a traced command Display the exit code when a system command fails Fix AssumeClean when the result exists but has never been built IMPORTANT: Incompatible on disk format change Change the storage to not write messages on --silent Add Applicative on Rules Add Applicative on Action 0.6, released 2013-01-09 Make addOracle return a result of askOracle Export the class bodies from Classes Don't export ShakeValue from Classes 0.5, released 2013-01-05 IMPORTANT: Incompatible on disk format change Add the ShakeValue constraint synonym Change the Oracle to be strongly typed Add a Classes module 0.4, released 2013-01-02 Add shakeFlush to control how often flush is called Fix a serious space leak in the thread pool #502, add shakeAssume to assume files are clean/dirty Remove validStored, replace with storedValue Remove the default validStored class, almost never right Remove Eq/Ord from ShakeOptions (now contains a func) #501, add statistics gathering functions to help report progress Ensure registering witnesses is thread safe Ensure GHC does not over-optimise type registration 0.3.10, released 2012-12-17 Add Paths.hs so you can run the tests from the tarball 0.3.9, released 2012-12-17 Don't rely on toList returning a consistent order Allow
hashable-1.2 0.3.8, released 2012-12-12 Fix up FilePattern so "foo//bar" ?== "foo/bar" 0.3.7, released 2012-12-02 Update the cabal file 0.3.6, released 2012-11-27 Add addOracles, for implementing more advanced oracles Add withoutActions, for implementing command line arguments 0.3.5, released 2012-11-27 #571, vastly improve the correctness of FilePattern #574, documentation typos Expose rulePriority 0.3.4, released 2012-09-17 Update documentation with links to ICFP 2012 paper/talk 0.3.3, released 2012-09-17 Minor refactorings 0.3.2, released 2012-09-12 Fix cabal specification on non-Windows 0.3.1, released 2012-09-08 Don't use syntax also stolen by QuasiQuotes Avoid warnings about unused monadic values Fix up getModificationTime for GHC 7.6 Don't assume catch is exported by Prelude (GHC 7.6 compat) Improve the error message when failing to build with *>> or ?>> Introduce a portable flag, to switch to portable FileTime ops Improve the error message when failing to build a file Ensure errors raised bubble up quickly Significant improvements to the profile output Only trace the program name by default, not the entire command Allow unordered-containers < 0.2 again Intern all keys, so they are only stored once Optimise modification time checking on Unix Optimise modification time checking on Windows Make some fields smaller on disk IMPORTANT: Incompatible on disk format change Switch to ByteString for storing File Add shakeDeterministic, along with a default random pool Make the Monad/Monoid instance for Rules strict 0.3, released 2012-06-17 #550, ensure the journal thread is locked #550, if your thread dies while reading the database then die #550, ensure you reset to before the slop before writing If the witness table is completely corrupt give a better error Make storage work with exceptions whose messages raise errors Journal error conditions, to ensure they don't repeat Better messages on invalid database versions Complete rewrite of the storage layer and journal 
design IMPORTANT: Incompatible on disk format change 0.2.11, released 2012-05-19 IMPORTANT: #546, don't save the database to where it was created 0.2.10, released 2012-05-19 Add systemCwd for running with a particular current directory Ensure the database is saved to where it was created #545, make sure if shake is killed, it kills its children 0.2.9, released 2012-03-25 Require unordered-containers 2.1 or higher, fixes bugs Allow transformers-0.3.* Add a random script generator and tester Improve the documentation for withResource Fix the unordered-containers constraint, was too loose Don't write report generation messages in Quiet mode Add ?>> operator, a generalisation of *>> 0.2.8, released 2012-02-24 Rename shakeDump to shakeReport Add profile report generator 0.2.7, released 2012-02-18 #530, require unordered-containers >= 0.1.4.3 0.2.6, released 2012-02-17 Improve the documentation code fragments (more links) Add support for managing finite resources 0.2.5, released 2012-02-15 Require hashable-1.1.2.3, which has a TypeRep instance Add Data/Typeable instances for ShakeOptions Add Data instance for Verbosity 0.2.4, released 2012-02-11 Include the C example source files Significant documentation improvements 0.2.3, released 2012-02-09 Create the shakeFiles directory if missing 0.2.2, released 2012-02-05 Allow deepseq-1.3.* Add a basic lint checking mode Remove the Dirty state entirely (was incorrect) 0.2.1, released 2012-01-28 Put diagnostics in more places Add a C example 0.2, released 2012-01-23 Add shakeStaunch Rename shakeParallel to shakeThreads Delete the lint support (never actually worked) Completely rewrite the central build algorithm Add verbosity Diagnostic Improve FilePath.normalise Add writeFileChanged Add systemOutput Add an Oracle module Add an explicit Verbosity type Change to lower precedence for *> etc 0.1.5, released 2012-01-01 Allow deepseq-1.2 0.1.4, released 2012-01-01 Don't export currentRule/currentStack Fix **>, was matching the wrong 
way round Fix FilePath normalise so it works properly on Windows Properly detect recursive rules, instead of looping Add *>> for building multiple files at once 0.1.3, released 2011-12-22 Fix a bug where a file rule could return with a lazy error Make sure all the files are stored in binary, not text Fix a pattern match error in getDirectoryFiles 0.1.2, released 2011-12-18 Add a warning in the description 0.1.1, released 2011-12-18 Significantly improved documentation system' now takes an initial argument for the program 0.1, released 2011-12-18 Many changes to signatures, some basic documentation Additional demo, to build shake itself 0.0, released 2011-12-11 Initial version, not ready for public use shake-0.16.4/src/0000755000000000000000000000000013261223302011644 5ustar0000000000000000shake-0.16.4/src/Test.hs0000644000000000000000000001204513261223301013120 0ustar0000000000000000 module Test(main) where import Control.Applicative import Control.Exception import Control.Monad import Data.Maybe import System.Environment.Extra import General.Timing import Development.Shake.Internal.FileInfo import Development.Shake.Internal.FileName import qualified Data.ByteString.Char8 as BS import Test.Type(sleepFileTimeCalibrate) import Control.Concurrent.Extra import Prelude import qualified Test.Basic as Basic import qualified Test.Batch as Batch import qualified Test.Benchmark as Benchmark import qualified Test.C as C import qualified Test.Cache as Cache import qualified Test.Command as Command import qualified Test.Config as Config import qualified Test.Digest as Digest import qualified Test.Directory as Directory import qualified Test.Docs as Docs import qualified Test.Errors as Errors import qualified Test.Existence as Existence import qualified Test.FileLock as FileLock import qualified Test.FilePath as FilePath import qualified Test.FilePattern as FilePattern import qualified Test.Files as Files import qualified Test.Forward as Forward import qualified Test.Journal as 
Journal import qualified Test.Lint as Lint import qualified Test.Live as Live import qualified Test.Manual as Manual import qualified Test.Match as Match import qualified Test.Monad as Monad import qualified Test.Ninja as Ninja import qualified Test.Oracle as Oracle import qualified Test.OrderOnly as OrderOnly import qualified Test.Parallel as Parallel import qualified Test.Pool as Pool import qualified Test.Progress as Progress import qualified Test.Random as Random import qualified Test.Rebuild as Rebuild import qualified Test.Resources as Resources import qualified Test.Self as Self import qualified Test.Tar as Tar import qualified Test.Tup as Tup import qualified Test.Unicode as Unicode import qualified Test.Util as Util import qualified Test.Verbosity as Verbosity import qualified Test.Version as Version import qualified Run fakes = ["clean" * clean, "test" * test, "make" * makefile, "filetime" * filetime] where (*) = (,) mains = ["basic" * Basic.main ,"batch" * Batch.main ,"benchmark" * Benchmark.main ,"c" * C.main ,"cache" * Cache.main ,"command" * Command.main ,"config" * Config.main ,"digest" * Digest.main ,"directory" * Directory.main ,"docs" * Docs.main ,"errors" * Errors.main ,"existence" * Existence.main ,"filelock" * FileLock.main ,"filepath" * FilePath.main ,"filepattern" * FilePattern.main ,"files" * Files.main ,"forward" * Forward.main ,"journal" * Journal.main ,"lint" * Lint.main ,"live" * Live.main ,"manual" * Manual.main ,"match" * Match.main ,"monad" * Monad.main ,"ninja" * Ninja.main ,"oracle" * Oracle.main ,"orderonly" * OrderOnly.main ,"parallel" * Parallel.main ,"pool" * Pool.main ,"progress" * Progress.main ,"random" * Random.main ,"rebuild" * Rebuild.main ,"resources" * Resources.main ,"self" * Self.main ,"tar" * Tar.main ,"tup" * Tup.main ,"unicode" * Unicode.main ,"util" * Util.main ,"verbosity" * Verbosity.main ,"version" * Version.main] where (*) = (,) main :: IO () main = do resetTimings xs <- getArgs exePath <- getExecutablePath 
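[Editor's note: illustrative sketch, not part of the original archive.] The `fakes`/`mains` tables above pair command names with `IO` actions (built with a local `(*) = (,)`), and `main` selects one with `flip lookup (fakes ++ mains) =<< listToMaybe xs`. A minimal, self-contained sketch of that lookup-dispatch pattern follows; the names `demoModes` and `demoDispatch` are hypothetical, not part of Shake:

```haskell
import Data.Maybe (listToMaybe)

-- Name-to-action table, built with the same (*) = (,) trick as Test.hs.
demoModes :: [(String, IO String)]
demoModes = ["hello" * return "hello mode"
            ,"bye"   * return "bye mode"]
    where (*) = (,)

-- Select an action from the first argument, if any; no match or no
-- arguments falls through to a default.
demoDispatch :: [String] -> IO String
demoDispatch args =
    case flip lookup demoModes =<< listToMaybe args of
        Just act -> act
        Nothing  -> return "unknown mode"

main :: IO ()
main = do
    putStrLn =<< demoDispatch ["hello"]  -- prints "hello mode"
    putStrLn =<< demoDispatch []         -- prints "unknown mode"
```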
case flip lookup (fakes ++ mains) =<< listToMaybe xs of _ | null xs -> do putStrLn "******************************************************************" putStrLn "** Running shake test suite, run with '--help' to see arguments **" putStrLn "******************************************************************" withArgs ["test"] main withArgs ["random","test","3m"] main Nothing -> putStrLn $ unlines ["Welcome to the Shake demo" ,"" ,unwords $ "Modes:" : map fst fakes ,unwords $ "Demos:" : map fst mains ,"" ,"As an example, try:" ,"" ,unwords [" ", exePath, "self", "--jobs=2", "--trace"] ,"" ,"Which will build Shake, using Shake, on 2 threads."] Just main -> main =<< sleepFileTimeCalibrate makefile :: IO () -> IO () makefile _ = do args <- getArgs withArgs (drop 1 args) Run.main filetime :: IO () -> IO () filetime _ = do args <- getArgs addTiming "Reading files" files <- fmap concat $ forM (drop 1 args) $ \file -> BS.lines . BS.filter (/= '\r') <$> BS.readFile file let n = length files evaluate n addTiming "Modtime" let (a,bcd) = splitAt (n `div` 4) files let (b,cd) = splitAt (n `div` 4) bcd let (c,d) = splitAt (n `div` 4) cd vars <- forM [a,b,c,d] $ \xs -> onceFork $ mapM_ (getFileInfo . 
fileNameFromByteString) xs sequence_ vars printTimings clean :: IO () -> IO () clean extra = sequence_ [withArgs [name,"clean"] $ main extra | (name,main) <- mains] test :: IO () -> IO () test yield = do args <- getArgs flip onException (putStrLn "TESTS FAILED") $ sequence_ [withArgs (name:"test":drop 1 args) $ test yield | (name,test) <- mains, name /= "random"] shake-0.16.4/src/Run.hs0000644000000000000000000000356313261223301012752 0ustar0000000000000000 module Run(main) where import Development.Ninja.All import System.Environment import Development.Shake import Development.Shake.FilePath import General.Timing(resetTimings) import Control.Monad.Extra import Control.Exception.Extra import Data.Maybe import qualified System.Directory as IO import General.Extra import General.GetOpt import System.Process import System.Exit main :: IO () main = do resetTimings args <- getArgs hsExe <- findFile [".shake" "shake" <.> exe ,"Shakefile.hs","Shakefile.lhs"] case hsExe of Just file -> do (prog,args) <- return $ if takeExtension file `elem` [".hs",".lhs"] then ("runhaskell", file:args) else (toNative file, args) e <- rawSystem prog args when (e /= ExitSuccess) $ exitWith e Nothing -> do let go = shakeArgsWith shakeOptions{shakeCreationCheck=False} flags $ \opts targets -> do let tool = listToMaybe [x | Tool x <- opts] makefile <- case reverse [x | UseMakefile x <- opts] of x:_ -> return x _ -> do res <- findFile ["build.ninja"] case res of Just x -> return x Nothing -> errorIO "Could not find `build.ninja'" runNinja go makefile targets tool withArgs ("--no-time":args) go data Flag = UseMakefile FilePath | Tool String flags = [Option "f" ["file","makefile"] (ReqArg (Right . UseMakefile) "FILE") "Read FILE as a makefile." ,Option "t" ["tool"] (ReqArg (Right . Tool) "TOOL") "Ninja-compatible tools." ] findFile :: [FilePath] -> IO (Maybe FilePath) findFile = findM (fmap (either (const False) id) . tryIO . 
IO.doesFileExist) shake-0.16.4/src/Paths.hs0000644000000000000000000000052313261223302013257 0ustar0000000000000000-- | Fake cabal module for local building module Paths_shake(getDataDir, version) where import Data.Version.Extra -- If Shake can't find files in the data directory it tries relative to the executable getDataDir :: IO FilePath getDataDir = return "random_path_that_cannot_possibly_exist" version :: Version version = makeVersion [0,0] shake-0.16.4/src/Test/0000755000000000000000000000000013261223302012563 5ustar0000000000000000shake-0.16.4/src/Test/Version.hs0000644000000000000000000000164013261223301014544 0ustar0000000000000000 module Test.Version(main) where import Development.Shake import Test.Type main = shakeTest_ test $ do want ["foo.txt"] "foo.txt" %> \file -> liftIO $ appendFile file "x" test build = do writeFile "foo.txt" "" v1 <- getHashedShakeVersion ["foo.txt"] writeFile "foo.txt" "y" v2 <- getHashedShakeVersion ["foo.txt"] assertBool (v1 /= v2) "Hashes must not be equal" build ["clean"] build [] assertContents "foo.txt" "x" build ["--rule-version=new","--silent"] assertContents "foo.txt" "xx" build ["--rule-version=new"] assertContents "foo.txt" "xx" build ["--rule-version=extra","--silent"] assertContents "foo.txt" "xxx" build ["--rule-version=more","--no-rule-version"] assertContents "foo.txt" "xxx" build ["--rule-version=more"] assertContents "foo.txt" "xxx" build ["--rule-version=final","--silent"] assertContents "foo.txt" "xxxx" shake-0.16.4/src/Test/Verbosity.hs0000644000000000000000000000222413261223301015104 0ustar0000000000000000 module Test.Verbosity(main) where import Development.Shake import Test.Type main = shakeTest_ test $ do "in.txt" %> \out -> do a <- getVerbosity b <- withVerbosity Normal getVerbosity writeFile' out $ unwords $ map show [a,b] "out.txt" %> \out -> do x <- getVerbosity ys <- withVerbosity Loud $ do a <- getVerbosity need ["in.txt"] -- make sure the inherited verbosity does not get passed along b <- 
getVerbosity c <- quietly getVerbosity d <- fmap shakeVerbosity getShakeOptions return [a,b,c,d] z <- getVerbosity writeFile' out $ unwords $ map show $ [x] ++ ys ++ [z] test build = do build ["out.txt","--clean"] assertContents "in.txt" "Normal Normal" assertContents "out.txt" "Normal Loud Loud Quiet Normal Normal" build ["out.txt","--clean","--verbose"] assertContents "in.txt" "Loud Normal" assertContents "out.txt" "Loud Loud Loud Quiet Loud Loud" build ["out.txt","--clean","--quiet"] assertContents "in.txt" "Quiet Normal" assertContents "out.txt" "Quiet Loud Loud Quiet Quiet Quiet" shake-0.16.4/src/Test/Util.hs0000644000000000000000000000144213261223301014034 0ustar0000000000000000 module Test.Util(main) where import Development.Shake.Util import Test.Type main = shakeTest_ test $ return () test build = do parseMakefile "" === [] parseMakefile "a:b c\ndef : ee" === [("a",["b","c"]),("def",["ee"])] parseMakefile "a: #comment\n#comment : b\nc : d" === [("a",[]),("c",["d"])] parseMakefile "a \\\n\t:b" === [("a",["b"])] parseMakefile "#comment\\ a : b" === [] parseMakefile "a: b c \\\n d e\n\nf:g" === [("a",["b","c","d","e"]),("f",["g"])] parseMakefile "foo/bar: \\\r\n c:/a1 \\\r\n x\r\n" === [("foo/bar",["c:/a1","x"])] parseMakefile "output.o: src/file/with\\ space.cpp" === [("output.o",["src/file/with space.cpp"])] parseMakefile "a: b\\ c" === [("a",["b ","c"])] parseMakefile "a: b\\ c\\ d e" === [("a",["b c d","e"])] shake-0.16.4/src/Test/Unicode.hs0000644000000000000000000000441513261223301014510 0ustar0000000000000000 module Test.Unicode(main) where import Development.Shake import Development.Shake.FilePath import Test.Type import General.GetOpt import Control.Exception.Extra import Control.Monad -- | Decode a dull ASCII string to certain unicode points, necessary because -- withArgs (even the UTF8 version) throws an encoding error on the > 256 code points decode :: String -> String decode ('e':'^':xs) = '\xEA' : decode xs -- Latin Small Letter E With 
Circumflex decode (':':')':xs) = '\x263A' : decode xs -- White Smiling Face decode (x:xs) = x : decode xs decode [] = [] data Arg = Prefix String | Want String opts = [Option "" ["prefix"] (ReqArg (Right . Prefix) "") "" ,Option "" ["want"] (ReqArg (Right . Want) "") ""] main = shakeTest test opts $ \xs -> do let pre = last $ "" : [decode x | Prefix x <- xs :: [Arg]] want [decode x | Want x <- xs] pre ++ "dir/*" %> \out -> do let src = takeDirectory (takeDirectory out) takeFileName out copyFile' src out pre ++ ".out" %> \out -> do a <- readFile' $ pre ++ "dir" pre <.> "source" b <- readFile' $ pre <.> "multi1" writeFile' out $ a ++ b ["*.multi1","*.multi2"] &%> \[m1,m2] -> do b <- doesFileExist $ m1 -<.> "exist" writeFile' m1 $ show b writeFile' m2 $ show b test build = do build ["clean"] -- Useful, if the error message starts crashing... -- IO.hSetEncoding IO.stdout IO.char8 -- IO.hSetEncoding IO.stderr IO.char8 forM_ ["normal","e^",":)","e^-:)"] $ \pre -> do let ext x = decode pre <.> x res <- try_ $ writeFile (ext "source") "x" case res of Left err -> putStrLn $ "WARNING: Failed to write file " ++ pre ++ ", skipping unicode test (LANG=C ?)" Right _ -> do build ["--prefix=" ++ pre, "--want=" ++ pre <.> "out", "--sleep"] assertContents (ext "out") $ "x" ++ "False" writeFile (ext "source") "y" build ["--prefix=" ++ pre, "--want=" ++ pre <.> "out", "--sleep"] assertContents (ext "out") $ "y" ++ "False" writeFile (ext "exist") "" build ["--prefix=" ++ pre, "--want=" ++ pre <.> "out"] assertContents (ext "out") $ "y" ++ "True" shake-0.16.4/src/Test/Type.hs0000644000000000000000000002666313261223301014054 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} module Test.Type( sleep, sleepFileTime, sleepFileTimeCalibrate, shakeTest, shakeTest_, root, noTest, hasTracker, copyDirectoryChanged, copyFileChanged, assertWithin, assertBool, assertBoolIO, assertException, assertContents, assertContentsUnordered, 
assertContentsWords, assertExists, assertMissing, (===), (&?%>), Pat(PatWildcard), pat, BinarySentinel(..), RandomType(..), ) where import Development.Shake hiding (copyFileChanged) import Development.Shake.Classes import Development.Shake.Forward import Development.Shake.Internal.FileName import General.Extra import Development.Shake.Internal.FileInfo import Development.Shake.FilePath import Development.Shake.Internal.Paths import Control.Exception.Extra import Control.Monad.Extra import Data.List import Data.Maybe import Data.Either import Data.Typeable.Extra import qualified Data.ByteString as BS import System.Directory.Extra as IO import System.Environment.Extra import System.Random import General.GetOpt import System.IO.Extra as IO import System.Time.Extra import Prelude shakeTest :: (([String] -> IO ()) -> IO ()) -- ^ The test driver -> [OptDescr (Either String a)] -- ^ Arguments the test can accept -> ([a] -> Rules ()) -- ^ The Shake script under test -> IO () -- ^ Sleep function, driven by passing @--sleep@ -> IO () shakeTest f opts g = shakenEx False opts f (\os args -> if null args then g os else want args >> withoutActions (g os)) shakeTest_ :: (([String] -> IO ()) -> IO ()) -- ^ The test driver -> Rules () -- ^ The Shake script under test -> IO () -- ^ Sleep function, driven by passing @--sleep@ -> IO () shakeTest_ f g = shakeTest f [] (const g) shakenEx :: Bool -> [OptDescr (Either String a)] -> (([String] -> IO ()) -> IO ()) -> ([a] -> [String] -> Rules ()) -> IO () -> IO () shakenEx reenter options test rules sleeper = do initDataDirectory name:args <- getArgs putStrLn $ "## BUILD " ++ unwords (name:args) let forward = "--forward" `elem` args args <- return $ delete "--forward" args cwd <- getCurrentDirectory let out = "output/" ++ name ++ "/" let obj x = if null x then "." 
else x let change = if not reenter then withCurrentDirectory out else id let clean = do now <- getCurrentDirectory when (takeBaseName now /= name) $ fail $ "Clean went horribly wrong! Dangerous deleting: " ++ show now withCurrentDirectory (now </> "..") $ do removeDirectoryRecursive now createDirectoryRecursive now unless reenter $ createDirectoryRecursive out case args of "test":extra -> do putStrLn $ "## TESTING " ++ name -- if the extra arguments are not --quiet/--loud it's probably going to go wrong -- as it is, they do go wrong for random, so disabling for now change $ test (\args -> withArgs (name:args {- ++ extra -}) $ shakenEx True options test rules sleeper) putStrLn $ "## FINISHED TESTING " ++ name "clean":_ -> change clean "perturb":args -> forever $ do del <- removeFilesRandom out threads <- randomRIO (1,4) putStrLn $ "## TESTING PERTURBATION (" ++ show del ++ " files, " ++ show threads ++ " threads)" shake shakeOptions{shakeFiles=out, shakeThreads=threads, shakeVerbosity=Quiet} $ rules [] args args -> do t <- tracker opts <- return $ shakeOptions {shakeFiles = obj "" ,shakeReport = [obj "report.html"]} opts <- return $ if forward then forwardOptions opts else opts {shakeLint = Just t ,shakeLintInside = [cwd] ,shakeLintIgnore = map (cwd </>) [".cabal-sandbox//",".stack-work//"]} withArgs args $ do let optionsBuiltin = optionsEnumDesc [(Clean, "Clean before building.") ,(Sleep, "Pause before executing.") ,(UsePredicate, "Use &?> in preference to &%>")] change $ shakeArgsOptionsWith opts (optionsBuiltin `mergeOptDescr` options) $ \so extra files -> do let (extra1, extra2) = partitionEithers extra when (Clean `elem` extra1) clean when (Sleep `elem` extra1) sleeper so <- return $ if UsePredicate `notElem` extra1 then so else so{shakeExtra = addShakeExtra UsePredicateYes $ shakeExtra so} if "clean" `elem` files then clean >> return Nothing else return $ Just $ (,) so $ do -- if you have passed sleep, suppress the "no actions" warning when (Sleep `elem` extra1) $
action $ return () rules extra2 files data Flags = Clean -- ^ Clean all the files before starting | Sleep -- ^ Call 'sleepFileTimeCalibrate' before starting | UsePredicate -- ^ Use &?> in preference to &%> deriving (Eq,Show) data UsePredicateYes = UsePredicateYes deriving Typeable (&?%>) :: [FilePattern] -> ([FilePath] -> Action ()) -> Rules () deps &?%> act = do so :: Maybe UsePredicateYes <- getShakeExtraRules if isJust so then (\x -> if x `elem` deps then Just deps else Nothing) &?> act else deps &%> act root :: FilePath root = "../.." tracker :: IO Lint tracker = do fsatrace <- findExecutable $ "fsatrace" <.> exe return $ if isJust fsatrace then LintFSATrace else LintBasic hasTracker :: IO Bool hasTracker = do t <- tracker return $ t == LintFSATrace assertBool :: Bool -> String -> IO () assertBool b msg = unless b $ error $ "ASSERTION FAILED: " ++ msg assertBoolIO :: IO Bool -> String -> IO () assertBoolIO b msg = do b <- b; assertBool b msg infix 4 === (===) :: (Show a, Eq a) => a -> a -> IO () a === b = assertBool (a == b) $ "failed in ===\nLHS: " ++ show a ++ "\nRHS: " ++ show b assertExists :: FilePath -> IO () assertExists file = do b <- IO.doesFileExist file assertBool b $ "File was expected to exist, but is missing: " ++ file assertMissing :: FilePath -> IO () assertMissing file = do b <- IO.doesFileExist file assertBool (not b) $ "File was expected to be missing, but exists: " ++ file assertWithin :: Seconds -> IO () -> IO () assertWithin n act = do t <- timeout n act when (isNothing t) $ assertBool False $ "Expected to complete within " ++ show n ++ " seconds, but did not" assertContents :: FilePath -> String -> IO () assertContents file want = do got <- IO.readFile' file assertBool (want == got) $ "File contents are wrong: " ++ file ++ "\nWANT: " ++ want ++ "\nGOT: " ++ got assertContentsOn :: (String -> String) -> FilePath -> String -> IO () assertContentsOn f file want = do got <- IO.readFile' file assertBool (f want == f got) $ "File contents are 
wrong: " ++ file ++ "\nWANT: " ++ want ++ "\nGOT: " ++ got ++ "\nWANT (transformed): " ++ f want ++ "\nGOT (transformed): " ++ f got assertContentsWords :: FilePath -> String -> IO () assertContentsWords = assertContentsOn (unwords . words) assertContentsUnordered :: FilePath -> [String] -> IO () assertContentsUnordered file xs = assertContentsOn (unlines . sort . lines) file (unlines xs) assertException :: [String] -> IO () -> IO () assertException parts act = do res <- try_ act case res of Left err -> let s = show err in forM_ parts $ \p -> assertBool (p `isInfixOf` s) $ "Incorrect exception, missing part:\nGOT: " ++ s ++ "\nWANTED: " ++ p Right _ -> error $ "Expected an exception containing " ++ show parts ++ ", but succeeded" noTest :: ([String] -> IO ()) -> IO () noTest build = do build ["--abbrev=output=$OUT","-j3"] build ["--no-build","--report=-"] build [] -- | Sleep long enough for the modification time resolution to catch up sleepFileTime :: IO () sleepFileTime = sleep 1 sleepFileTimeCalibrate :: IO (IO ()) sleepFileTimeCalibrate = do let file = "output/calibrate" createDirectoryRecursive $ takeDirectory file -- with 10 measurements can get a bit slow, see #451 -- if it rounds to a second then 1st will be a fraction, but 2nd will be full second mtimes <- forM [1..2] $ \i -> fmap fst $ duration $ do writeFile file $ show i let time = fmap (fst . fromMaybe (error "File missing during sleepFileTimeCalibrate")) $ getFileInfo $ fileNameFromString file t1 <- time flip loopM 0 $ \j -> do writeFile file $ show (i,j) t2 <- time return $ if t1 == t2 then Left $ j+1 else Right () putStrLn $ "Longest file modification time lag was " ++ show (ceiling (maximum' mtimes * 1000)) ++ "ms" return $ sleep $ min 1 $ maximum' mtimes * 2 removeFilesRandom :: FilePath -> IO Int removeFilesRandom x = do files <- getDirectoryContentsRecursive x n <- randomRIO (0,length files) rs <- replicateM (length files) (randomIO :: IO Double) mapM_ (removeFile . 
snd) $ sort $ zip rs files return n getDirectoryContentsRecursive :: FilePath -> IO [FilePath] getDirectoryContentsRecursive dir = do xs <- IO.getDirectoryContents dir (dirs,files) <- partitionM IO.doesDirectoryExist [dir x | x <- xs, not $ "." `isPrefixOf` x] rest <- concatMapM getDirectoryContentsRecursive dirs return $ files++rest copyDirectoryChanged :: FilePath -> FilePath -> IO () copyDirectoryChanged old new = do xs <- getDirectoryContentsRecursive old forM_ xs $ \from -> do let to = new drop (length $ addTrailingPathSeparator old) from createDirectoryRecursive $ takeDirectory to copyFileChanged from to copyFileChanged :: FilePath -> FilePath -> IO () copyFileChanged old new = do good <- IO.doesFileExist new good <- if not good then return False else liftM2 (==) (BS.readFile old) (BS.readFile new) unless good $ copyFile old new -- The operators %> ?> &*> &?> |?> |*> all have an isomorphism data Pat = PatWildcard | PatPredicate | PatOrWildcard | PatAndWildcard | PatAndPredicate deriving (Read, Show, Enum, Bounded) pat :: Pat -> FilePattern -> (FilePath -> Action ()) -> Rules () pat PatWildcard p act = p %> act pat PatPredicate p act = (p ?==) ?> act pat PatOrWildcard p act = [p] |%> act pat PatAndWildcard p act = -- single wildcard shortcircuits, so we use multiple to avoid that -- and thus have to fake writing an extra file [p, p ++ "'"] &%> \[x,x'] -> do act x; writeFile' x' "" pat PatAndPredicate p act = (\x -> if p ?== x then Just [x] else Nothing) &?> \[x] -> act x --------------------------------------------------------------------- -- TEST MATERIAL -- Some errors require multiple modules to replicate (e.g. #506), so put that here newtype BinarySentinel a = BinarySentinel () deriving (Eq,Show,NFData,Typeable,Hashable) instance forall a . 
Typeable a => Binary (BinarySentinel a) where put (BinarySentinel x) = put $ show (typeRep (Proxy :: Proxy a)) get = do x <- get let want = show (typeRep (Proxy :: Proxy a)) if x == want then return $ BinarySentinel () else error $ "BinarySentinel failed, got " ++ show x ++ " but wanted " ++ show want newtype RandomType = RandomType (BinarySentinel ()) deriving (Eq,Show,NFData,Typeable,Hashable,Binary) shake-0.16.4/src/Test/Tup.hs0000644000000000000000000000264213261223301013672 0ustar0000000000000000 module Test.Tup(main) where import Development.Shake import Development.Shake.Config import Development.Shake.FilePath import Development.Shake.Util import Test.Type import Control.Applicative import Data.Maybe import Prelude main = shakeTest_ noTest $ do -- Example inspired by http://gittup.org/tup/ex_multiple_directories.html usingConfigFile $ root "src/Test/Tup/root.cfg" action $ do keys <- getConfigKeys need [x -<.> exe | x <- keys, takeExtension x == ".exe"] let objects dir key = do let f x | takeExtension x == ".c" = dir x -<.> "o" | takeExtension x == ".a" = takeBaseName x "lib" ++ x | otherwise = error $ "Unknown extension, " ++ x x <- fromMaybe (error $ "Missing config key, " ++ key) <$> getConfig key return $ map f $ words x (\x -> x -<.> exe == x) ?> \out -> do os <- objects "" $ takeBaseName out <.> "exe" need os cmd "gcc" os "-o" [out] "//lib*.a" %> \out -> do os <- objects (drop 3 $ takeBaseName out) $ drop 3 $ takeFileName out need os cmd "ar crs" [out] os "//*.o" %> \out -> do let src = root "src/Test/Tup" out -<.> "c" need [src] cmd_ "gcc -c -MMD -MF" [out -<.> "d"] [src] "-o" [out] "-O2 -Wall" ["-I" ++ root "src/Test/Tup/newmath"] neededMakefileDependencies $ out -<.> "d" shake-0.16.4/src/Test/Tar.hs0000644000000000000000000000051613261223301013646 0ustar0000000000000000 module Test.Tar(main) where import Development.Shake import System.FilePath import Test.Type main = shakeTest_ noTest $ do want ["result.tar"] "result.tar" %> \out -> do contents <- 
fmap (map (root </>)) $ readFileLines $ root </> "src/Test/Tar/list.txt" need contents cmd "tar -cf" [out] contents shake-0.16.4/src/Test/Self.hs0000644000000000000000000000641213261223301014012 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, TypeFamilies #-} module Test.Self(main) where import Development.Shake import Development.Shake.Classes import Development.Shake.FilePath import Test.Type import Control.Applicative import Control.Monad.Extra import Data.Char import Data.List.Extra import System.Info import Data.Version.Extra import Prelude newtype GhcPkg = GhcPkg () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) newtype GhcFlags = GhcFlags () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) type instance RuleResult GhcPkg = [String] type instance RuleResult GhcFlags = [String] main = shakeTest_ noTest $ do let moduleToFile ext xs = replace "." "/" xs <.> ext want ["Main" <.> exe] -- fixup to cope with Cabal's generated files let fixPaths x = if x == "Paths_shake.hs" then "Paths.hs" else x ghcPkg <- addOracle $ \GhcPkg{} -> do Stdout out <- quietly $ cmd "ghc-pkg list --simple-output" return $ words out ghcFlags <- addOracle $ \GhcFlags{} -> map ("-package=" ++) <$> readFileLines ".pkgs" let ghc args = do -- since ghc-pkg includes the ghc package, it changes if the version does ghcPkg $ GhcPkg () flags <- ghcFlags $ GhcFlags () cmd "ghc" flags args "Main" <.> exe %> \out -> do src <- readFileLines "Run.deps" let os = map (moduleToFile "o") $ "Run" : src need os ghc $ ["-o",out] ++ os "//*.deps" %> \out -> do dep <- readFileLines $ out -<.> "dep" let xs = map (moduleToFile "deps") dep need xs ds <- nubOrd . sort . (++) dep <$> concatMapM readFileLines xs writeFileLines out ds "//*.dep" %> \out -> do src <- readFile' $ root </> "src" </> fixPaths (out -<.> "hs") let xs = hsImports src xs <- filterM (doesFileExist . (\x -> root </> "src" </> x) . fixPaths .
moduleToFile "hs") xs
        writeFileLines out xs

    ["//*.o","//*.hi"] &%> \[out,_] -> do
        deps <- readFileLines $ out -<.> "deps"
        let hs = root </> "src" </> fixPaths (out -<.> "hs")
        need $ hs : map (moduleToFile "hi") deps
        ghc ["-c",hs,"-i" ++ root </> "src","-main-is","Run.main"
            ,"-hide-all-packages","-outputdir=."
            ,"-DPORTABLE","-fwarn-unused-imports"] -- to test one CPP branch

    ".pkgs" %> \out -> do
        src <- readFile' $ root </> "shake.cabal"
        writeFileLines out $ sort $ cabalBuildDepends src

---------------------------------------------------------------------
-- GRAB INFORMATION FROM FILES

hsImports :: String -> [String]
hsImports xs =
    [ takeWhile (\x -> isAlphaNum x || x `elem` "._") $ dropWhile (not . isUpper) x
    | x <- concatMap (wordsBy (== ';')) $ lines xs, "import " `isPrefixOf` trim x]

-- FIXME: Should actually parse the list from the contents of the .cabal file
cabalBuildDepends :: String -> [String]
cabalBuildDepends _ = packages ++ ["unix" | os /= "mingw32"]

packages = words
    ("base transformers binary unordered-containers hashable time bytestring primitive " ++
     "filepath directory process deepseq random utf8-string extra js-jquery js-flot") ++
    ["old-time" | compilerVersion < makeVersion [7,6]] ++
    ["semigroups" | compilerVersion < makeVersion [8,0]]
shake-0.16.4/src/Test/Resources.hs0000644000000000000000000000575513261223301015104 0ustar0000000000000000
module Test.Resources(main) where

import Development.Shake
import Test.Type
import Data.List
import System.FilePath
import Control.Exception.Extra
import System.Time.Extra
import Control.Monad
import Data.IORef

main = shakeTest_ test $ do
    -- test I have good Ord and Show
    do
        r1 <- newResource "test" 2
        r2 <- newResource "special" 67
        unless (r1 < r2 || r2 < r1) $ fail "Resources should have a good ordering"
        unless ("special" `isInfixOf` show r2) $ fail "Resource should contain their name when shown"

    -- test you are capped to a maximum value
    do
        let cap = 2
        inside <- liftIO $ newIORef 0
        res <- newResource "test" cap
        phony "cap" $
need ["c_file" ++ show i ++ ".txt" | i <- [1..4]] "c_*.txt" %> \out -> withResource res 1 $ do old <- liftIO $ atomicModifyIORef inside $ \i -> (i+1,i) when (old >= cap) $ fail "Too many resources in use at one time" liftIO $ sleep 0.1 liftIO $ atomicModifyIORef inside $ \i -> (i-1,i) writeFile' out "" -- test things can still run while you are blocked on a resource do done <- liftIO $ newIORef 0 lock <- newResource "lock" 1 phony "schedule" $ need $ map ("s_" ++) $ "lock1":"done":["free" ++ show i | i <- [1..10]] ++ ["lock2"] "s_done" %> \out -> do need ["s_lock1","s_lock2"] done <- liftIO $ readIORef done when (done < 10) $ fail "Not all managed to schedule while waiting" writeFile' out "" "s_lock*" %> \out -> do withResource lock 1 $ liftIO $ sleep 0.5 writeFile' out "" "s_free*" %> \out -> do liftIO $ atomicModifyIORef done $ \i -> (i+1,()) writeFile' out "" -- test that throttle works properly do res <- newThrottle "throttle" 2 0.4 phony "throttle" $ need ["t_file1.1","t_file2.1","t_file3.2","t_file4.1","t_file5.2"] "t_*.*" %> \out -> do withResource res (read $ drop 1 $ takeExtension out) $ when (takeBaseName out == "t_file3") $ liftIO $ sleep 0.2 writeFile' out "" test build = do build ["-j2","cap","--clean"] build ["-j4","cap","--clean"] build ["-j10","cap","--clean"] build ["-j2","schedule","--clean"] forM_ ["-j1","-j8"] $ \flags -> -- we are sometimes over the window if the machine is "a bit loaded" at some particular time -- therefore we rerun the test three times, and only fail if it fails on all of them retry 3 $ do (s, _) <- duration $ build [flags,"throttle","--no-report","--clean"] -- the 0.1s cap is a guess at an upper bound for how long everything else should take -- and should be raised on slower machines assertBool (s >= 1.4 && s < 1.6) $ "Bad throttling, expected to take 1.4s + computation time (cap of 0.2s), took " ++ show s ++ "s" shake-0.16.4/src/Test/Rebuild.hs0000644000000000000000000000420313261223301014503 0ustar0000000000000000 module 
Test.Rebuild(main) where import Development.Shake import Test.Type import Text.Read.Extra import Control.Monad import General.GetOpt data Opt = Timestamp String | Pattern Pat opts = [Option "" ["timestamp"] (ReqArg (Right . Timestamp) "VALUE") "Value used to detect what has rebuilt when" ,Option "" ["pattern"] (ReqArg (fmap Pattern . readEither) "PATTERN") "Which file rules to use (%>, &?> etc)"] main = shakeTest test opts $ \args -> do let timestamp = concat [x | Timestamp x <- args] let p = last $ PatWildcard : [x | Pattern x <- args] want ["a.txt"] pat p "a.txt" $ \out -> do src <- readFile' "b.txt" writeFile' out $ src ++ timestamp pat p "b.txt" $ \out -> do src <- readFile' "c.txt" writeFile' out $ src ++ timestamp test build = forM_ [minBound..maxBound :: Pat] $ \pat -> do build ["clean"] let go arg c b a flags = do writeFileChanged "c.txt" c build $ ["--timestamp=" ++ arg, "--sleep","--no-reports","--pattern=" ++ show pat] ++ flags assertContents "b.txt" b assertContents "a.txt" a -- check rebuild works go "1" "x" "x1" "x11" [] go "2" "x" "x1" "x11" [] go "3" "x" "x1" "x13" ["--rebuild=a.*"] go "4" "x" "x1" "x13" [] go "5" "x" "x5" "x55" ["--rebuild=b.*"] go "6" "x" "x6" "x66" ["--rebuild"] go "7" "x" "x6" "x66" [] go "8" "y" "y8" "y88" [] -- check skip works go "1" "x" "x1" "x11" [] go "2" "y" "y2" "x11" ["--skip=a.*"] go "3" "y" "y2" "y23" [] go "4" "z" "y2" "y23" ["--skip=b.*"] go "5" "z" "y2" "y23" ["--skip=b.*"] go "6" "z" "z6" "z66" [] go "7" "a" "z6" "z66" ["--skip=c.*"] go "8" "a" "z6" "z66" ["--skip=b.*"] go "9" "a" "a9" "z66" ["--skip=a.*"] go "0" "a" "a9" "a90" [] {- -- check skip-forever works -- currently it does not work properly go "1" "x" "x1" "x11" [] go "2" "y" "y2" "x11" ["--skip-forever=a.*"] go "3" "y" "y2" "x11" [] go "4" "z" "z4" "z44" [] -} shake-0.16.4/src/Test/Random.hs0000644000000000000000000001167313261223301014346 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} module Test.Random(main) where import Development.Shake import 
Numeric.Extra import Test.Type import Control.Applicative import Control.Exception.Extra import Control.Monad import Data.List import Data.Maybe import General.GetOpt import System.Environment import System.Exit import System.Random import General.Extra import qualified System.IO.Extra as IO import System.Time.Extra import Prelude inputRange = [1..10] data Value = Single Int | Multiple [[Value]] deriving (Read,Show,Eq) data Source = Input Int | Output Int | Bang deriving (Read,Show) data Logic = Logic Int [[Source]] | Want [Int] deriving (Read,Show) arg = [Option "" ["arg"] (ReqArg Right "") ""] main = shakeTest test arg $ \args -> do let toFile (Input i) = "input-" ++ show i ++ ".txt" toFile (Output i) = "output-" ++ show i ++ ".txt" toFile Bang = error "BANG" let randomSleep = liftIO $ do i <- randomRIO (0, 25) sleep $ intToDouble i / 100 forM_ (map read $ filter (isNothing . asDuration) args) $ \x -> case x of Want xs -> want $ map (toFile . Output) xs Logic out srcs -> toFile (Output out) %> \out -> do res <- fmap (show . Multiple) $ forM srcs $ \src -> do randomSleep need $ map toFile src mapM (liftIO . fmap read . IO.readFile' . toFile) src randomSleep writeFileChanged out res asDuration :: String -> Maybe Double asDuration x | "s" `isSuffixOf` x, [(i,"")] <- reads $ init x = Just i | "m" `isSuffixOf` x, [(i,"")] <- reads $ init x = Just $ i * 60 | otherwise = Nothing test build = do limit <- do args <- getArgs let bound = listToMaybe $ reverse $ mapMaybe asDuration args time <- offsetTime return $ when (isJust bound) $ do now <- time when (now > fromJust bound) exitSuccess forM_ [1..] 
$ \count -> do
        limit
        putStrLn $ "* PERFORMING RANDOM TEST " ++ show count
        build ["clean"]
        build [] -- to create the directory
        forM_ inputRange $ \i -> writeFile ("input-" ++ show i ++ ".txt") $ show $ Single i
        logic <- randomLogic
        runLogic [] logic
        chng <- filterM (const randomIO) inputRange
        forM_ chng $ \i -> writeFile ("input-" ++ show i ++ ".txt") $ show $ Single $ negate i
        runLogic chng logic
        forM_ inputRange $ \i -> writeFile ("input-" ++ show i ++ ".txt") $ show $ Single i
        logicBang <- addBang =<< addBang logic
        j <- randomRIO (1::Int,8)
        res <- try_ $ build $ "--exception" : ("-j" ++ show j) :
            map ((++) "--arg=" . show) (logicBang ++ [Want [i | Logic i _ <- logicBang]])
        case res of
            Left err
                | "BANG" `isInfixOf` show err -> return () -- error I expected
                | otherwise -> error $ "UNEXPECTED ERROR: " ++ show err
            -- occasionally we only put BANG in places with no dependencies that don't get rebuilt
            _ -> return ()
        runLogic [] $ logic ++ [Want [i | Logic i _ <- logic]]
    where
        runLogic :: [Int] -> [Logic] -> IO ()
        runLogic negated xs = do
            let poss = [i | Logic i _ <- xs]
            i <- randomRIO (0, 7)
            wants <- replicateM i $ do
                i <- randomRIO (0, 5)
                replicateM i $ randomElem poss
            sleepFileTime
            j <- randomRIO (1::Int,8)
            build $ ("-j" ++ show j) : map ((++) "--arg=" .
show) (xs ++ map Want wants) let value i = case [ys | Logic j ys <- xs, j == i] of [ys] -> Multiple $ flip map ys $ map $ \i -> case i of Input i -> Single $ if i `elem` negated then negate i else i Output i -> value i forM_ (concat wants) $ \i -> do let wanted = value i got <- fmap read $ IO.readFile' $ "output-" ++ show i ++ ".txt" when (wanted /= got) $ error $ "INCORRECT VALUE for " ++ show i addBang :: [Logic] -> IO [Logic] addBang xs = do i <- randomRIO (0, length xs - 1) let (before,now:after) = splitAt i xs now <- f now return $ before ++ now : after where f (Logic log xs) = do i <- randomRIO (0, length xs) let (before,after) = splitAt i xs return $ Logic log $ before ++ [Bang] : after randomLogic :: IO [Logic] -- only Logic constructors randomLogic = do rules <- randomRIO (1,100) f rules $ map Input inputRange where f 0 avail = return [] f i avail = do needs <- randomRIO (0,3) xs <- replicateM needs $ do ns <- randomRIO (0,3) replicateM ns $ randomElem avail let r = Logic i xs (r:) <$> f (i-1) (Output i:avail) shake-0.16.4/src/Test/Progress.hs0000644000000000000000000000407113261223301014724 0ustar0000000000000000module Test.Progress(main) where import Development.Shake.Internal.Progress import Test.Type import System.Directory.Extra import System.FilePath import Data.Monoid import Prelude main = shakeTest_ test $ return () -- | Given a list of todo times, get out a list of how long is predicted prog = progEx 10000000000000000 progEx :: Double -> [Double] -> IO [Double] progEx mxDone todo = do let resolution = 10000 -- Use resolution to get extra detail on the numbers let done = scanl (+) 0 $ map (min mxDone . max 0) $ zipWith (-) todo (tail todo) let res = progressReplay $ zip (map (*resolution) [1..]) $ tail $ zipWith (\t d -> mempty{timeBuilt=d*resolution,timeTodo=(t*resolution,0)}) todo done return $ (0/0) : map ((/ resolution) . 
actualSecs) res

test build = do
    -- perfect functions should match perfectly
    xs <- prog [10,9..1]
    drop 2 xs === [8,7..1]
    xs <- prog $ map (*5) [10,9..1]
    drop 2 xs === [8,7..1]
    xs <- prog $ map (*0.2) [10,9..1]
    let dp3 x = fromIntegral (round $ x * 1000 :: Int) / 1000
    map dp3 (drop 2 xs) === [8,7..1]

    -- The properties below this line could be weakened

    -- increasing functions can't match
    xs <- prog [5,6,7]
    last xs === 7

    -- the first value must be plausible, or missing
    xs <- prog [187]
    assertBool (isNaN $ head xs) "No first value"

    -- desirable properties, could be weakened
    xs <- progEx 2 $ 100:map (*2) [10,9..1]
    drop 5 xs === [6,5..1]
    xs <- progEx 1 [10,9,100,8,7,6,5,4,3,2,1]
    assertBool (all (<= 1.5) $ map abs $ zipWith (-) (drop 5 xs) [6,5..1]) "Close"

    -- if no progress is made, don't keep the time going up
    xs <- prog [10,9,8,7,7,7,7,7]
    drop 5 xs === [7,7,7]

    -- if the work rate changes, should somewhat reflect that
    xs <- prog [10,9,8,7,6.5,6,5.5,5]
    assertBool (last xs > 7.1) "Some discounting (factor=0 would give 7)"

    xs <- listFiles $ root </> "src/Test/Progress"
    build $ ["--progress=replay=" ++ x | x <- xs, takeExtension x == ".prog"] ++
            ["--no-report","--report=-","--report=" ++ "progress.html"]
shake-0.16.4/src/Test/Pool.hs0000644000000000000000000000632013261223301014030 0ustar0000000000000000
module Test.Pool(main) where

import Test.Type
import Development.Shake.Internal.Core.Pool
import Control.Concurrent.Extra
import Control.Exception.Extra
import Control.Monad

main = shakeTest_ test $ return ()

test build = do
    -- See #474, we should never be running pool actions masked
    let addPool pool act = addPoolStart pool $ do
            Unmasked <- getMaskingState
            act

    forM_ [False,True] $ \deterministic -> do
        -- check that it aims for exactly the limit
        forM_ [1..6] $ \n -> do
            var <- newMVar (0,0) -- (maximum, current)
            runPool deterministic n $ \pool ->
                forM_ [1..5] $ \i -> addPool pool $ do
                    modifyMVar_ var $ \(mx,now) -> return (max (now+1) mx, now+1)
                    -- requires that all tasks get
spawned within 0.1s sleep 0.1 modifyMVar_ var $ \(mx,now) -> return (mx,now-1) res <- takeMVar var res === (min n 5, 0) -- check that exceptions are immediate good <- newVar True started <- newBarrier stopped <- newBarrier res <- try_ $ runPool deterministic 3 $ \pool -> do addPool pool $ do waitBarrier started error "pass" addPool pool $ flip finally (signalBarrier stopped ()) $ do signalBarrier started () sleep 10 modifyVar_ good $ const $ return False -- note that the pool finishing means we started killing our threads -- not that they have actually died case res of Left e | Just (ErrorCall "pass") <- fromException e -> return () _ -> fail $ "Wrong type of result, got " ++ show res waitBarrier stopped assertBoolIO (readVar good) "Must be true" -- check someone spawned when at zero todo still gets run done <- newBarrier runPool deterministic 1 $ \pool -> addPool pool $ addPool pool $ signalBarrier done () assertWithin 1 $ waitBarrier done -- check high priority stuff runs first res <- newVar "" runPool deterministic 1 $ \pool -> do let note c = modifyVar_ res $ return . 
(c:) -- deliberately in a random order addPoolBatch pool $ note 'b' addPoolException pool $ note 'e' addPoolStart pool $ note 's' addPoolStart pool $ note 's' addPoolResume pool $ note 'r' addPoolException pool $ note 'e' (=== "bssree") =<< readVar res -- check that killing a thread pool stops the tasks, bug 545 thread <- newBarrier died <- newBarrier done <- newBarrier t <- forkIO $ flip finally (signalBarrier died ()) $ runPool deterministic 1 $ \pool -> addPool pool $ flip onException (signalBarrier done ()) $ do killThread =<< waitBarrier thread sleep 10 signalBarrier thread t assertWithin 1 $ waitBarrier done assertWithin 1 $ waitBarrier died shake-0.16.4/src/Test/Parallel.hs0000644000000000000000000000307013261223301014652 0ustar0000000000000000 module Test.Parallel(main) where import Development.Shake import Test.Type import Data.Tuple.Extra import Control.Monad import Control.Concurrent.Extra import Data.IORef main = shakeTest_ test $ do "AB.txt" %> \out -> do -- need [obj "A.txt", obj "B.txt"] (text1,text2) <- readFile' "A.txt" `par` readFile' "B.txt" writeFile' out $ text1 ++ text2 phony "cancel" $ do writeFile' "cancel" "" done <- liftIO $ newIORef 0 lock <- liftIO newLock void $ parallel $ replicate 5 $ liftIO $ do x <- atomicModifyIORef done $ dupe . succ when (x == 3) $ do sleep 0.1; fail "boom" withLock lock $ appendFile "cancel" "x" phony "parallel" $ do active <- liftIO $ newIORef 0 peak <- liftIO $ newIORef 0 void $ parallel $ replicate 8 $ liftIO $ do now <- atomicModifyIORef active $ dupe . succ atomicModifyIORef peak $ dupe . max now sleep 0.1 atomicModifyIORef active $ dupe . 
pred peak <- liftIO $ readIORef peak writeFile' "parallel" $ show peak test build = do writeFile "A.txt" "AAA" writeFile "B.txt" "BBB" build ["AB.txt","--sleep"] assertContents "AB.txt" "AAABBB" appendFile "A.txt" "aaa" build ["AB.txt"] assertContents "AB.txt" "AAAaaaBBB" assertException ["boom"] $ build ["cancel","-j1","--quiet"] assertContents "cancel" "xx" build ["parallel","-j1"] assertContents "parallel" "1" build ["parallel","-j5"] assertContents "parallel" "5" shake-0.16.4/src/Test/OrderOnly.hs0000644000000000000000000000302713261223301015035 0ustar0000000000000000 module Test.OrderOnly(main) where import Development.Shake import Test.Type import System.Directory(removeFile) import Control.Exception.Extra main = shakeTest_ test $ do "bar.txt" %> \out -> do alwaysRerun writeFile' out =<< liftIO (readFile "bar.in") "foo.txt" %> \out -> do let src = "bar.txt" orderOnly [src] writeFile' out =<< liftIO (readFile src) need [src] "baz.txt" %> \out -> do let src = "bar.txt" orderOnly [src] liftIO $ appendFile out "x" "primary.txt" %> \out -> do need ["source.txt"] orderOnly ["intermediate.txt"] writeFile' out =<< liftIO (readFile "intermediate.txt") "intermediate.txt" %> \out -> copyFile' "source.txt" out test build = do writeFile "bar.in" "in" build ["foo.txt","--sleep"] assertContents "foo.txt" "in" writeFile "bar.in" "out" build ["foo.txt","--sleep"] assertContents "foo.txt" "out" writeFile "baz.txt" "" writeFile "bar.in" "in" build ["baz.txt","--sleep"] assertContents "baz.txt" "x" writeFile "bar.in" "out" build ["baz.txt"] assertContents "baz.txt" "x" ignore $ removeFile "intermediate.txt" writeFile "source.txt" "x" build ["primary.txt","--sleep"] assertContents "intermediate.txt" "x" removeFile "intermediate.txt" build ["primary.txt","--sleep"] assertMissing "intermediate.txt" writeFile "source.txt" "y" build ["primary.txt","--sleep"] assertContents "intermediate.txt" "y" shake-0.16.4/src/Test/Oracle.hs0000644000000000000000000000777413261223301014342 
0ustar0000000000000000{-# LANGUAGE TypeFamilies, ConstraintKinds #-} {-# LANGUAGE DeriveDataTypeable, GeneralizedNewtypeDeriving #-} module Test.Oracle(main) where import Development.Shake import Development.Shake.Classes import Development.Shake.FilePath import General.GetOpt import Data.List.Extra import Data.Tuple.Extra import Test.Type hiding (RandomType) import qualified Test.Type as T import Control.Monad import Data.Functor import Prelude -- These are instances we'll compute over type instance RuleResult String = String type instance RuleResult Int = String type instance RuleResult () = String type instance RuleResult Bool = Bool -- test results don't have to be a boolean newtype FileLen = FileLen FilePath deriving (Eq,Show,NFData,Typeable,Hashable,Binary) type instance RuleResult FileLen = Int newtype RandomType = RandomType (BinarySentinel String) deriving (Eq,Show,NFData,Typeable,Hashable,Binary) type instance RuleResult RandomType = Int type instance RuleResult T.RandomType = Int data Define = Define String String -- this type produces this result opt = [Option "" ["def"] (ReqArg (Right . uncurry Define . second tail . breakOn "=") "type=value") ""] main = shakeTest test opt $ \args -> do addOracle $ \(T.RandomType _) -> return 42 addOracle $ \(RandomType _) -> return (-42) "randomtype.txt" %> \out -> do a <- askOracle $ T.RandomType $ BinarySentinel () b <- askOracle $ RandomType $ BinarySentinel () writeFile' out $ show (a,b) addOracle $ \b -> return $ not b "true.txt" %> \out -> writeFile' out . show =<< askOracle False let add :: (ShakeValue a, RuleResult a ~ String) => String -> a -> Rules () add name key = do name <.> "txt" %> \out -> do liftIO $ appendFile ".log" "." 
writeFile' out =<< askOracle key forM_ [val | Define nam val <- args, nam == name] $ \val -> addOracle $ \k -> let _ = k `asTypeOf` key in return val add "string" "" add "unit" () add "int" (0 :: Int) addOracleCache $ \(FileLen file) -> do liftIO $ appendFile ".log" "#" length <$> readFile' file "cache_out.txt" %> \out -> do o <- askOracle (FileLen "cache_in.txt") liftIO $ appendFile ".log" "!" writeFile' out $ show o test build = do build ["clean"] build ["randomtype.txt"] assertContents "randomtype.txt" "(42,-42)" -- check it rebuilds when it should writeFile ".log" "" build ["--def=string=name","string.txt"] assertContents "string.txt" "name" build ["--def=string=name","string.txt"] assertContents "string.txt" "name" build ["--def=string=test","string.txt"] assertContents "string.txt" "test" assertContents ".log" ".." -- check adding/removing redundant oracles does not trigger a rebuild build ["--def=string=test","string.txt","--def=unit=bob"] build ["--def=string=test","string.txt","--def=int=fred"] build ["--def=string=test","string.txt"] assertContents "string.txt" "test" assertContents ".log" ".." writeFile ".log" "" writeFile "cache_in.txt" "aaa" build ["cache_out.txt","--sleep"] assertContents "cache_out.txt" "3" assertContents ".log" "#!" writeFile "cache_in.txt" "zzz" build ["cache_out.txt","--sleep"] build ["cache_out.txt","--sleep"] assertContents "cache_out.txt" "3" assertContents ".log" "#!#" writeFile "cache_in.txt" "zzzz" build ["cache_out.txt","--sleep"] assertContents "cache_out.txt" "4" assertContents ".log" "#!##!" 
-- check error messages are good
    let errors args err = assertException [err] $ build $ "--quiet" : args
    build ["--def=unit=test","unit.txt"]
    errors ["unit.txt"] -- Building with an Oracle that has been removed
        "missing a call to addOracle"
    errors ["int.txt"] -- Building with an Oracle that I know nothing about
        "missing a call to addOracle"
    errors ["--def=string=1","--def=string=1"] -- Two Oracles defined in one go
        "oracle defined twice"
shake-0.16.4/src/Test/Ninja.hs0000644000000000000000000001107013261223301014156 0ustar0000000000000000
module Test.Ninja(main) where

import Development.Shake
import qualified Development.Shake.Config as Config
import System.Directory(copyFile, removeFile)
import Control.Applicative
import Control.Monad
import General.GetOpt
import General.Extra
import Test.Type
import qualified Data.HashMap.Strict as Map
import Data.List
import Data.Maybe
import System.IO.Extra
import qualified Run
import System.Environment
import Prelude

opts = Option "" ["arg"] (ReqArg Right "") ""

main = shakeTest test [opts] $ \opts -> do
    let real = "real" `elem` opts
    action $ if real then cmd "ninja" opts else
        liftIO $ withArgs ("--lint":"--report=report.html":opts) Run.main

test build = do
    let runEx ninja shake = build $ "--exception" : "--no-report" : map ("--arg=" ++) (words ninja) ++ words shake
    let run ninja = runEx ninja []
    let runFail ninja bad = assertException [bad] $ runEx ninja "--quiet"

    build ["clean"]
    run "-f../../src/Test/Ninja/test1.ninja"
    assertExists "out1.txt"

    run "-f../../src/Test/Ninja/test2.ninja"
    assertExists "out2.2"
    assertMissing "out2.1"
    build ["clean"]
    run "-f../../src/Test/Ninja/test2.ninja out2.1"
    assertExists "out2.1"
    assertMissing "out2.2"

    copyFile "../../src/Test/Ninja/test3-sub.ninja" "test3-sub.ninja"
    copyFile "../../src/Test/Ninja/test3-inc.ninja" "test3-inc.ninja"
    createDirectoryRecursive "subdir"
    copyFile "../../src/Test/Ninja/subdir/1.ninja" "subdir/1.ninja"
    copyFile "../../src/Test/Ninja/subdir/2.ninja" "subdir/2.ninja"
run "-f../../src/Test/Ninja/test3.ninja" assertContentsWords "out3.1" "g4+b1+++i1" assertContentsWords "out3.2" "g4++++i1" assertContentsWords "out3.3" "g4++++i1" assertContentsWords "out3.4" "g4+++s1+s2" run "-f../../src/Test/Ninja/test4.ninja out" assertExists "out.txt" assertExists "out2.txt" run "-f../../src/Test/Ninja/test5.ninja" assertExists "output file" run "-f../../src/Test/Ninja/test7.ninja" writeFile "nocreate.log" "" writeFile "nocreate.in" "" run "-f../../src/Test/Ninja/nocreate.ninja" assertContentsWords "nocreate.log" "x" run "-f../../src/Test/Ninja/nocreate.ninja" run "-f../../src/Test/Ninja/nocreate.ninja" assertContentsWords "nocreate.log" "x x x" writeFile "input" "" runFail "-f../../src/Test/Ninja/lint.ninja bad --lint" "'needed' file required rebuilding" run "-f../../src/Test/Ninja/lint.ninja good --lint" runFail "-f../../src/Test/Ninja/lint.ninja bad --lint" "not a pre-dependency" res <- fmap (drop 1 . lines . fst) $ captureOutput $ runEx "-f../../src/Test/Ninja/compdb.ninja -t compdb cxx" "--quiet" want <- lines <$> readFile "../../src/Test/Ninja/compdb.output" let eq a b | (a1,'*':a2) <- break (== '*') a = unless (a1 `isPrefixOf` b && a2 `isSuffixOf` b) $ a === b | otherwise = a === b length want === length res zipWithM_ eq want res -- Test initial variable bindings and variables in include/subninja statements let test6 = "test6" copyFile "../../src/Test/Ninja/test6-sub.ninja" $ test6 ++ "-sub.ninja" copyFile "../../src/Test/Ninja/test6-inc.ninja" $ test6 ++ "-inc.ninja" copyFile "../../src/Test/Ninja/test6.ninja" $ test6 ++ ".ninja" config <- Config.readConfigFileWithEnv [("v1", test6)] $ test6 ++ ".ninja" -- The file included by subninja should have a separate variable scope Map.lookup "v2" config === Just "g2" run "-f../../src/Test/Ninja/phonyorder.ninja bar.txt" -- tests from ninjasmith: https://github.com/ndmitchell/ninjasmith/ run "-f../../src/Test/Ninja/redefine.ninja" assertContentsWords "redefine.txt" "version3 version2" run 
"-f../../src/Test/Ninja/buildseparate.ninja" assertContentsWords "buildseparate.txt" "XX" run "-f../../src/Test/Ninja/lexical.ninja" assertContentsWords "lexical.txt" "XFoo_BarXXFooX.bar" run "-f../../src/Test/Ninja/continuations.ninja" assertExists "continuations.txt" copyFile "../../src/Test/Ninja/restart.ninja" "restart.ninja" runEx "-frestart.ninja" "--sleep" assertExists "restart.txt" when False $ do -- currently fails because Shake doesn't match Ninja here run "-f../../src/Test/Ninja/outputtouch.ninja" assertContentsWords "outputtouch.txt" "hello" writeFile "outputtouch.txt" "goodbye" run "-f../../src/Test/Ninja/outputtouch.ninja" assertContentsWords "outputtouch.txt" "goodbye" removeFile "outputtouch.txt" run "-f../../src/Test/Ninja/outputtouch.ninja" assertContentsWords "outputtouch.txt" "hello" shake-0.16.4/src/Test/Monad.hs0000644000000000000000000000545213261223301014162 0ustar0000000000000000 module Test.Monad(main) where import Test.Type import Development.Shake.Internal.Core.Monad import Data.IORef import Control.Concurrent import Control.Exception import Control.Monad import Control.Monad.IO.Class main = shakeTest_ test $ return () run :: ro -> rw -> RAW ro rw a -> IO a run ro rw m = do res <- newEmptyMVar runRAW ro rw m $ void . tryPutMVar res either throwIO return =<< readMVar res test build = do let conv x = either (Left . fromException) Right x :: Either (Maybe ArithException) Int let dump rw = liftIO . 
(=== rw) =<< getRW -- test the basics plus exception handling run 1 "test" $ do dump "test" putRW "more" dump "more" res <- tryRAW $ do dump "more" modifyRW (++ "x") dump "morex" return 100 liftIO $ conv res === Right 100 dump "morex" putRW "new" dump "new" res <- tryRAW $ do dump "new" modifyRW (++ "z") dump "newz" throwRAW Overflow error "Should not have reached here" return 9 liftIO $ conv res === Left (Just Overflow) dump "newz" catchRAW (catchRAW (throwRAW Overflow) $ \_ -> modifyRW (++ "x")) $ \_ -> modifyRW (++ "y") dump "newzx" catchRAW (catchRAW (throwRAW Overflow) $ \e -> modifyRW (++ "x") >> throwRAW e) $ \_ -> modifyRW (++ "y") dump "newzxxy" -- test capture run 1 "test" $ do i <- captureRAW $ \k -> k $ Right 1 liftIO $ i === 1 i <- tryRAW $ captureRAW $ \k -> k $ Left $ toException Overflow liftIO $ conv i === Left (Just Overflow) captureRAW $ \k -> k $ Right () i <- tryRAW $ throwRAW Underflow liftIO $ conv i === Left (Just Underflow) -- catch does not scope too far res <- try $ run 1 "test" $ fmap (either show id) $ tryRAW $ captureRAW $ \k -> throwIO Overflow res === Left Overflow res <- try $ run 1 "test" $ do captureRAW $ \k -> throwIO Overflow return "x" res === Left Overflow -- test for GHC bug 11555 runRAW 1 "test" (throw Overflow :: RAW Int String ()) $ \res -> either (Left . fromException) Right res === Left (Just Overflow) -- catch works properly if continuation called multiple times ref <- newIORef [] run 1 "test" $ flip catchRAW (const $ liftIO $ modifyIORef ref ('x':)) $ do captureRAW $ \k -> do k $ Right () k $ Right () k $ Left $ toException Overflow k $ Right () k $ Left $ toException Overflow flip catchRAW (const $ liftIO $ modifyIORef ref ('y':)) $ throwRAW $ toException Overflow (===) "xyxyy" =<< readIORef ref shake-0.16.4/src/Test/Match.hs0000644000000000000000000000333413261223301014155 0ustar0000000000000000 -- | Test the rule matching facilities - alternatives, priority etc. 
module Test.Match(main) where import Development.Shake import Test.Type main = shakeTest_ test $ do let output x file = writeFile' file x ["or*","*or"] |%> output "" alternatives $ do "alternative.t*" %> output "alternative.t*" "alternative.*" %> output "alternative.*" priority 100 $ priority 0 $ "priority.txt" %> output "100" priority 50 $ "priority.txt" %> output "50" alternatives $ do priority 20 $ "altpri.txt" %> output "20" priority 40 $ "altpri.txt" %> output "40" priority 30 $ "altpri.txt" %> output "30" alternatives $ do priority 21 $ "altpri2.txt" %> output "21" priority 22 $ "altpri2.txt" %> output "22" priority 23 $ "altpri2.txt" %> output "23" priority 55 $ alternatives $ "x" %> output "55" priority 51 $ "x" %> output "51" priority 42 $ alternatives $ "xx" %> output "42" priority 43 $ "xx" %> output "43" priority 10 $ do priority 6 $ "change" %> output "6" priority 7 $ "change" %> output "7" priority 8 $ "change" %> output "8" priority 9 $ "change" %> output "9" test build = do build ["clean"] build ["or"] build ["alternative.foo","alternative.txt"] assertContents "alternative.foo" "alternative.*" assertContents "alternative.txt" "alternative.t*" build ["priority.txt"] assertContents "priority.txt" "100" build ["altpri.txt","altpri2.txt"] assertContents "altpri.txt" "20" assertContents "altpri2.txt" "23" build ["x","xx"] assertContents "x" "55" assertContents "xx" "43" assertException ["matches multiple rules","3"] $ build ["change","--quiet"] shake-0.16.4/src/Test/Manual.hs0000644000000000000000000000214113261223301014331 0ustar0000000000000000 module Test.Manual(main) where import Development.Shake hiding (copyFileChanged) import Development.Shake.FilePath import Test.Type import General.Extra import Data.Maybe import System.Info.Extra main = shakeTest_ test $ action $ fail "The 'manual' example should only be used in test mode" test build = do -- we use .git as our destination, despite not being a real git repo -- so that search tools ignore it, and 
I don't get dupes for every source file
    let dest = ".git"
    copyDirectoryChanged (root </> "docs/manual") dest
    copyDirectoryChanged (root </> "src/Development") $ dest </> "Development"
    copyDirectoryChanged (root </> "src/General") $ dest </> "General"
    copyFileChanged (root </> "src/Paths.hs") $ dest </> "Paths_shake.hs"
    (_, gccPath) <- findGcc
    let opts = [Cwd dest, Shell, AddPath [] (maybeToList gccPath)]
    let cmdline = if isWindows then "build.bat" else "/bin/sh build.sh"
    cmd_ opts cmdline "-j2"
    assertExists $ dest </> "_build/run" <.> exe
    cmd_ opts cmdline
    cmd_ opts [cmdline,"clean"]
    assertMissing $ dest </> "_build/run" <.> exe
shake-0.16.4/src/Test/Live.hs0000644000000000000000000000125013261223301014013 0ustar0000000000000000
module Test.Live(main) where

import Development.Shake
import Test.Type

main = shakeTest_ test $ do
    "foo" %> \out -> do
        need ["bar"]
        writeFile' out ""
    "bar" %> \out -> writeFile' out ""
    "baz" %> \out -> writeFile' out ""

test build = do
    build ["clean"]
    build ["foo","baz","--live=live.txt"]
    assertContentsUnordered "live.txt" $ words "foo bar baz"
    build ["foo","baz","--live=live.txt"]
    assertContentsUnordered "live.txt" $ words "foo bar baz"
    build ["foo","--live=live.txt"]
    assertContentsUnordered "live.txt" $ words "foo bar"
    build ["bar","--live=live.txt"]
    assertContentsUnordered "live.txt" $ words "bar"
shake-0.16.4/src/Test/Lint.hs0000644000000000000000000001062213261223301014025 0ustar0000000000000000{-# LANGUAGE TypeFamilies, GeneralizedNewtypeDeriving, DeriveDataTypeable #-}
module Test.Lint(main) where

import Development.Shake
import Development.Shake.Classes
import Development.Shake.FilePath
import General.Extra
import Test.Type
import Control.Exception
import System.Directory as IO
import System.Info.Extra
import Control.Monad.Extra

newtype Zero = Zero () deriving (Eq, Show, NFData, Typeable, Hashable, Binary)
type instance RuleResult Zero = Zero

main = shakeTest_ test $ do
    addOracle $ \Zero{} -> do
        liftIO $ createDirectoryRecursive "dir"
        liftIO $ setCurrentDirectory
"dir" return $ Zero () "changedir" %> \out -> do Zero () <- askOracle $ Zero () writeFile' out "" "pause.*" %> \out -> do liftIO $ sleep 0.1 need ["cdir" <.> takeExtension out] writeFile' out "" "cdir.*" %> \out -> do pwd <- liftIO getCurrentDirectory let dir2 = "dir" ++ takeExtension out liftIO $ createDirectoryRecursive dir2 liftIO $ setCurrentDirectory dir2 liftIO $ sleep 0.2 liftIO $ setCurrentDirectory pwd writeFile' out "" "createonce" %> \out -> writeFile' out "X" "createtwice" %> \out -> do need ["createonce"] liftIO sleepFileTime writeFile' "createonce" "Y" writeFile' out "" "listing" %> \out -> do writeFile' (out <.> "ls1") "" getDirectoryFiles "" ["//*.ls*"] writeFile' (out <.> "ls2") "" writeFile' out "" "existance" %> \out -> do Development.Shake.doesFileExist "exists" writeFile' "exists" "" writeFile' out "" "gen*" %> \out -> writeFile' out out "needed1" %> \out -> do needed ["gen1"] writeFile' out "" "needed2" %> \out -> do orderOnly ["gen2"] needed ["gen2"] writeFile' out "" "tracker-write1" %> \out -> do gen "x" $ out <.> "txt" need [out <.> "txt"] writeFile' out "" "tracker-write2" %> \out -> do gen "x" $ out <.> "txt" writeFile' out "" "tracker-source2" %> \out -> copyFile' "tracker-source1" out "tracker-read1" %> \out -> do access "tracker-source1" writeFile' out "" "tracker-read2" %> \out -> do access "tracker-source1" need ["tracker-source1"] writeFile' out "" "tracker-read3" %> \out -> do access "tracker-source2" need ["tracker-source2"] writeFile' out "" "tracker-compile.o" %> \out -> do need ["tracker-source.c", "tracker-source.h"] cmd "gcc" ["-c", "tracker-source.c", "-o", out] "tracker-compile-auto.o" %> \out -> do need ["tracker-source.c"] cmd AutoDeps "gcc" ["-c", "tracker-source.c", "-o", out] where gen t f = cmd Shell "echo" t ">" (toNative f) :: Action () access f = if isWindows then cmd_ Shell "type" (toNative f) "> nul" else cmd_ Shell "cat" f "> /dev/null" test build = do dir <- getCurrentDirectory let crash args parts = 
            assertException parts (build $ "--quiet" : args) `finally` setCurrentDirectory dir
    crash ["changedir"] ["current directory has changed"]
    build ["cdir.1","cdir.2","-j1"]
    build ["--clean","cdir.1","pause.2","-j1"]
    crash ["--clean","cdir.1","pause.2","-j2"] ["output","lint","current directory has changed"]
    crash ["existance"] ["changed since being depended upon"]
    crash ["createtwice"] ["changed since being depended upon"]
    crash ["listing"] ["changed since being depended upon","listing.ls2"]
    crash ["--clean","listing","existance"] ["changed since being depended upon"]
    crash ["needed1"] ["'needed' file required rebuilding"]
    build ["needed2"]
    whenM hasTracker $ do
        writeFile "tracker-source1" ""
        writeFile "tracker-source2" ""
        writeFile "tracker-source.c" "#include <stdio.h>\n#include \"tracker-source.h\"\n"
        writeFile "tracker-source.h" ""
        crash ["tracker-write1"] ["not have its creation tracked","tracker-write1","tracker-write1.txt"]
        build ["tracker-write2"]
        crash ["tracker-read1"] ["used but not depended upon","tracker-source1"]
        build ["tracker-read2"]
        crash ["tracker-read3"] ["depended upon after being used","tracker-source2"]
        build ["tracker-compile.o"]
        build ["tracker-compile-auto.o"]

shake-0.16.4/src/Test/Journal.hs

module Test.Journal(main) where

import Control.Monad
import Data.IORef
import Development.Shake
import Development.Shake.FilePath
import Test.Type
import System.IO.Unsafe

{-# NOINLINE rebuilt #-}
rebuilt :: IORef Int
rebuilt = unsafePerformIO $ newIORef 0

main = shakeTest_ test $ do
    want ["a.out","b.out","c.out"]
    "*.out" %> \out -> do
        liftIO $ atomicModifyIORef rebuilt $ \a -> (a+1,())
        copyFile' (out -<.> "in") out

test build = do
    let change x = writeFile (x <.> "in") x
    let count x = do
            before <- readIORef rebuilt
            build ["--sleep"]
            after <- readIORef rebuilt
            x === after - before
    change "a"
    change "b"
    change "c"
    count 3
    -- test that compressing the database doesn't corrupt anything
    replicateM_
        4 $ do
            change "a"
            count 1
            change "a"
            change "c"
            count 2

shake-0.16.4/src/Test/Forward.hs

module Test.Forward(main) where

import Development.Shake
import Development.Shake.Forward
import Development.Shake.FilePath
import Control.Monad.Extra
import Test.Type

main = shakeTest_ test $ forwardRule $ do
    let src = root </> "src/Test/C"
    cs <- getDirectoryFiles src ["*.c"]
    os <- forP cs $ \c -> do
        let o = c <.> "o"
        cache $ cmd "gcc -c" [src </> c] "-o" [o]
        return o
    cache $ cmd "gcc -o" ["Main" <.> exe] os

test build = whenM hasTracker $ do
    build ["--forward","--clean"]
    build ["--forward","-j2"]

shake-0.16.4/src/Test/Files.hs

module Test.Files(main) where

import Development.Shake
import Development.Shake.FilePath
import System.Directory
import Test.Type
import Control.Monad
import Data.List

main = shakeTest test [] $ \opts -> do
    want ["even.txt","odd.txt"]

    "A1-plus-B" %> \out -> do
        a1 <- readFileLines "A1"
        b <- readFileLines "B"
        writeFileLines out $ a1 ++ b

    ["A1", "A2"] &%> \[o1, o2] -> do
        writeFileLines o1 ["This is", "A1"]
        writeFileLines o2 ["This is", "A2"]

    "B" %> \out -> writeFileLines out ["This is", "B"]

    ["even.txt","odd.txt"] &?%> \[evens,odds] -> do
        src <- readFileLines "numbers.txt"
        let (es,os) = partition even $ map read src
        writeFileLines evens $ map show es
        writeFileLines odds $ map show os

    ["dir1/out.txt","dir2/out.txt"] &?%> \[a,b] -> do
        writeFile' a "a"
        writeFile' b "b"

    (\x -> let dir = takeDirectory x in
        if takeFileName dir /= "pred" then Nothing else Just [dir </> "a.txt",dir </> "b.txt"]) &?> \outs ->
        mapM_ (`writeFile'` "") outs

test build = do
    forM_ [[],["--usepredicate"]] $ \args -> do
        let nums = unlines .
map show writeFile "numbers.txt" $ nums [1,2,4,5,2,3,1] build ("--sleep":args) assertContents "even.txt" $ nums [2,4,2] assertContents "odd.txt" $ nums [1,5,3,1] build ["clean"] build ["--no-build","--report=-"] build ["dir1/out.txt"] build ["pred/a.txt"] -- Test #496 build ["A1-plus-B"] removeFile "A2" build ["A1-plus-B"] shake-0.16.4/src/Test/FilePattern.hs0000644000000000000000000002024713261223301015340 0ustar0000000000000000 module Test.FilePattern(main) where import Development.Shake.Internal.FilePattern import Development.Shake.FilePath import Control.Monad import System.IO.Unsafe import System.Info.Extra import Data.List.Extra import Test.Type import Test.QuickCheck hiding ((===)) main = shakeTest_ test $ return () newtype Pattern = Pattern FilePattern deriving (Show,Eq) newtype Path = Path FilePath deriving (Show,Eq) -- Since / and * are the only "interesting" elements, just add ab to round out the set instance Arbitrary Pattern where arbitrary = fmap Pattern $ listOf $ elements "\\/*ab" shrink (Pattern x) = map Pattern $ shrinkList (\x -> ['/' | x == '\\']) x instance Arbitrary Path where arbitrary = fmap Path $ listOf $ elements "\\/ab" shrink (Path x) = map Path $ shrinkList (\x -> ['/' | x == '\\']) x test build = do internalTest let norm = filter (/= ".") . 
split isPathSeparator let f b pat file = do assertBool (b == (pat ?== file)) $ show pat ++ " ?== " ++ show file ++ "\nEXPECTED: " ++ show b assertBool (b == (pat `walker` file)) $ show pat ++ " `walker` " ++ show file ++ "\nEXPECTED: " ++ show b when b $ assertBool (norm (substitute (extract pat file) pat) == norm file) $ "FAILED substitute/extract property\nPattern: " ++ show pat ++ "\nFile: " ++ show file ++ "\n" ++ "Extracted: " ++ show (extract pat file) ++ "\nSubstitute: " ++ show (substitute (extract pat file) pat) f True "//*.c" "foo/bar/baz.c" f True "**/*.c" "foo/bar/baz.c" f True (toNative "//*.c") "foo/bar\\baz.c" f True (toNative "**/*.c") "foo/bar\\baz.c" f True "*.c" "baz.c" f True "//*.c" "baz.c" f True "**/*.c" "baz.c" f True "test.c" "test.c" f False "*.c" "foor/bar.c" f False "*/*.c" "foo/bar/baz.c" f False "foo//bar" "foobar" f False "foo/**/bar" "foobar" f False "foo//bar" "foobar/bar" f False "foo/**/bar" "foobar/bar" f False "foo//bar" "foo/foobar" f False "foo/**/bar" "foo/foobar" f True "foo//bar" "foo/bar" f True "foo/**/bar" "foo/bar" f True "foo/bar" (toNative "foo/bar") f True (toNative "foo/bar") "foo/bar" f True (toNative "foo/bar") (toNative "foo/bar") f True "//*" "/bar" f False "**/*" "/bar" f True "/bob//foo" "/bob/this/test/foo" f True "/bob/**/foo" "/bob/this/test/foo" f False "/bob//foo" "bob/this/test/foo" f False "/bob/**/foo" "bob/this/test/foo" f True "bob//foo/" "bob/this/test/foo/" f True "bob/**/foo/" "bob/this/test/foo/" f False "bob//foo/" "bob/this/test/foo" f False "bob/**/foo/" "bob/this/test/foo" f True "a//" "a" f True "a/**" "a" f True "/a//" "/a" f True "/a/**" "/a" f True "///a//" "/a" f False "**/a/**" "/a" f False "///" "" f True "///" "/" f True "/**" "/" f True "///" "a/" f True "**/" "a/" f True "////" "" f True "**/**" "" f True "x///y" "x/y" f True "x/**/y" "x/y" f True "x///" "x/" f True "x/**/" "x/" f True "x///" "x/foo/" f True "x/**/" "x/foo/" f False "x///" "x" f False "x/**/" "x" f True "x///" 
"x/foo/bar/" f True "x/**/" "x/foo/bar/" f False "x///" "x/foo/bar" f False "x/**/" "x/foo/bar" f True "x///y" "x/z/y" f True "x/**/*/y" "x/z/y" f True "" "" f False "" "y" f False "" "/" f True "*/*" "x/y" f False "*/*" "x" f True "//*" "x" f True "**/*" "x" f True "//*" "" f True "**/*" "" f True "*//" "x" f True "*/**" "x" f True "*//" "" f True "*/**" "" f True "*//*" "x/y" f True "*/**/*" "x/y" f False "*//*" "" f False "*/**/*" "" f False "*//*" "x" f False "*/**/*" "x" f False "*//*//*" "x/y" f False "*/**/*/**/*" "x/y" f True "//*/" "/" f False "**/*/" "/" f True "*/////" "/" f True "*/**/**/" "/" f False "b*b*b*//" "bb" f False "b*b*b*/**" "bb" f False "**" "/" f False "**/x" "/x" f True "**" "x/" f (not isWindows) "**" "\\\\drive" f (not isWindows) "**" "C:\\drive" f (not isWindows) "**" "C:drive" -- We support ignoring '.' values in FilePath as they are inserted by @filepath@ a lot f True "./file" "file" f True ("" "file") "file" f True "foo/./bar" "foo/bar" f True "foo/./bar" "foo/./bar" f False "foo/./bar" "foo/bob" filePattern "**/*.c" "test.txt" === Nothing filePattern "**/*.c" "foo.c" === Just ["","foo"] filePattern "**/*.c" "bar/baz/foo.c" === Just ["bar/baz/","foo"] filePattern "**/*.c" "bar\\baz\\foo.c" === Just (if isWindows then ["bar/baz/","foo"] else ["","bar\\baz\\foo"]) simple "a*b" === False simple "a//b" === False simple "a/**/b" === False simple "/a/b/cccc_" === True simple "a///b" === False simple "a/**/b" === False assertBool (compatible []) "compatible" assertBool (compatible ["//*a.txt","foo//a*.txt"]) "compatible" assertBool (compatible ["**/*a.txt","foo/**/a*.txt"]) "compatible" assertBool (compatible ["//*a.txt","foo/**/a*.txt"]) "compatible" assertBool (not $ compatible ["//*a.txt","foo//a*.*txt"]) "compatible" assertBool (not $ compatible ["**/*a.txt","foo/**/a*.*txt"]) "compatible" extract "//*a.txt" "foo/bar/testa.txt" === ["foo/bar/","test"] extract "**/*a.txt" "foo/bar/testa.txt" === ["foo/bar/","test"] extract "//*a.txt" 
"testa.txt" === ["","test"] extract "**/*a.txt" "testa.txt" === ["","test"] extract "//a.txt" "a.txt" === [""] extract "**/a.txt" "a.txt" === [""] extract "//a.txt" "/a.txt" === ["/"] extract "a//b" "a/b" === [""] extract "a/**/b" "a/b" === [""] extract "a//b" "a/x/b" === ["x/"] extract "a/**/b" "a/x/b" === ["x/"] extract "a//b" "a/x/y/b" === ["x/y/"] extract "a/**/b" "a/x/y/b" === ["x/y/"] extract "a///b" "a/x/y/b" === ["x/y/"] extract "a/**/**/b" "a/x/y/b" === ["","x/y/"] extract "//*a*.txt" "testada.txt" === ["","test","da"] extract "**/*a*.txt" "testada.txt" === ["","test","da"] extract (toNative "//*a*.txt") "testada.txt" === ["","test","da"] extract (toNative "**/*a*.txt") "testada.txt" === ["","test","da"] substitute ["","test","da"] "//*a*.txt" === "testada.txt" substitute ["","test","da"] "**/*a*.txt" === "testada.txt" substitute ["foo/bar/","test"] "//*a.txt" === "foo/bar/testa.txt" substitute ["foo/bar/","test"] "**/*a.txt" === "foo/bar/testa.txt" (False, Walk _) <- return $ walk ["*.xml"] (False, Walk _) <- return $ walk ["//*.xml"] (False, Walk _) <- return $ walk ["**/*.xml"] (False, WalkTo ([], [("foo",Walk _)])) <- return $ walk ["foo//*.xml"] (False, WalkTo ([], [("foo",Walk _)])) <- return $ walk ["foo/**/*.xml"] (False, WalkTo ([], [("foo",WalkTo ([],[("bar",Walk _)]))])) <- return $ walk ["foo/bar/*.xml"] (False, WalkTo (["a"],[("b",WalkTo (["c"],[]))])) <- return $ walk ["a","b/c"] ([], [("foo",WalkTo ([],[("bar",Walk _)]))]) <- let (False, Walk f) = walk ["*/bar/*.xml"] in return $ f ["foo"] (False, WalkTo ([],[("bar",Walk _),("baz",Walk _)])) <- return $ walk ["bar/*.xml","baz//*.c"] (False, WalkTo ([],[("bar",Walk _),("baz",Walk _)])) <- return $ walk ["bar/*.xml","baz/**/*.c"] (False, WalkTo ([], [])) <- return $ walk [] (True, Walk _) <- return $ walk ["//"] (True, Walk _) <- return $ walk ["**"] (True, WalkTo _) <- return $ walk [""] Success{} <- quickCheckWithResult stdArgs{maxSuccess=1000} $ \(Pattern p) (Path x) -> let label _ = 
property in -- Ignore label to workaround QuickCheck space-leak -- See #450 and https://github.com/nick8325/quickcheck/pull/93 let b = p ?== x in (if b then property else label "No match") $ unsafePerformIO $ do f b p x; return True return () walker :: FilePattern -> FilePath -> Bool -- Slight difference of opinion since Walker is always relative to something walker a b | isRelativePattern a, not $ isRelativePath b = False walker a b = f (split isPathSeparator b) $ snd $ walk [a] where f (".":xs) w = f xs w f (x:xs) (Walk op) = f (x:xs) $ WalkTo $ op [x] f [x] (WalkTo (file, dir)) = x `elem` file f (x:xs) (WalkTo (file, dir)) | Just w <- lookup x dir = f xs w f _ _ = False shake-0.16.4/src/Test/FilePath.hs0000644000000000000000000000642013261223301014614 0ustar0000000000000000 module Test.FilePath(main) where import Development.Shake.FilePath import Development.Shake import qualified System.FilePath as Native import Test.Type import Test.QuickCheck import Control.Monad import Data.List import qualified Data.ByteString.Char8 as BS import qualified Development.Shake.Internal.FileName as BS import System.Info.Extra main = shakeTest_ test $ return () newtype File = File String deriving Show instance Arbitrary File where arbitrary = fmap File $ listOf $ oneof $ map return "a /\\:." shrink (File x) = map File $ shrink x test build = do let a === b = a Test.Type.=== b -- duplicate definition in QuickCheck 2.7 and above let norm x = let s = toStandard $ normaliseEx x b = BS.unpack (BS.filepathNormalise $ BS.pack x) in if s == b then s else error $ show ("Normalise functions differ",x,s,b) -- basic examples norm "" === "." norm "." === "." norm "/" === "/" norm "./" === "./" norm "/." === "/." norm "/./" === "/" norm "a/." === "a" norm "./a" === "a" norm "./a/." === "a" norm "./a/./" === "a/" norm "a/.." === "." norm "a/./.." === "." 
norm "a/../" === "./" norm "/a/../" === "/" norm "/a/./../" === "/" norm "../a" === "../a" norm "/../a/" === "/../a/" -- more realistic examples norm "neil//./test/moo/../bar/bob/../foo" === "neil/test/bar/foo" norm "bar/foo" === "bar/foo" norm "bar/foo/" === "bar/foo/" norm "../../foo" === "../../foo" norm "foo/../..///" === "../" norm "foo/bar/../../neil" === "neil" norm "foo/../bar/../neil" === "neil" norm "/foo/bar" === "/foo/bar" norm "//./" === (if isWindows then "//" else "/") norm "//foo/./bar" === (if isWindows then "//foo/bar" else "/foo/bar") when isWindows $ norm "c:\\foo\\bar" === "c:/foo/bar" when isWindows $ normaliseEx "foo/bar\\baz" === "foo\\bar\\baz" Success{} <- quickCheckWithResult stdArgs{maxSuccess=1000} $ \(File x) -> let y = norm x sep = Native.isPathSeparator noDrive = if isWindows then drop 1 else id ps = [y /= "" ,null x || (sep (head x) == sep (head y) && sep (last x) == sep (last y)) ,not $ "/./" `isInfixOf` y ,not isWindows || '\\' `notElem` y ,not $ "//" `isInfixOf` noDrive y ,".." 
                `notElem` dropWhile (== "..") (dropWhile (\x -> all isPathSeparator x || isDrive x) $ splitDirectories y)
            ,norm y == y]
        in if and ps then True else error (show (x, y, ps))

    dropDirectory1 "aaa/bbb" === "bbb"
    dropDirectory1 "aaa/" === ""
    dropDirectory1 "aaa" === ""
    dropDirectory1 "" === ""
    takeDirectory1 "aaa/bbb" === "aaa"
    takeDirectory1 "aaa/" === "aaa"
    takeDirectory1 "aaa" === "aaa"
    searchPathSeparator === Native.searchPathSeparator
    pathSeparators === Native.pathSeparators

    if isWindows then
        toNative "//this is a///test\\here/" === "\\\\this is a\\\\\\test\\here\\"
    else
        toNative "//this is a///test\\here/" === "//this is a///test\\here/"

    -- check common manipulations work
    ("//*" <.> "foo") === "//*.foo"
    toStandard ("a" </> "b" </> "c" <//> "*" <.> "exe") === "a/b/c//*.exe"
    ("a" </> "/b/c") === "a//b/c"

shake-0.16.4/src/Test/FileLock.hs

module Test.FileLock(main) where

import Development.Shake
import Control.Concurrent.Extra
import Control.Exception.Extra
import Control.Monad
import Data.Either.Extra
import System.Time.Extra
import System.Info.Extra
import Test.Type

main = shakeTest_ test $ action $ do
    putNormal "Starting sleep"
    liftIO $ sleep 5
    putNormal "Finished sleep"

-- Disabled under Mac because it fails, see #560
test build = unless isMac $ do
    -- check it fails exactly once
    time <- offsetTime
    lock <- newLock
    let out msg = do t <- time; withLock lock $ print (t, msg)
    out "before onceFork"
    a <- onceFork $ do out "a1"; build ["-VVV"]; out "a2"
    b <- onceFork $ do out "b1"; build ["-VVV"]; out "b2"
    out "after onceFork"
    a <- try_ a
    out "after try a"
    b <- try_ b
    out "after try b"
    when (length (filter isLeft [a,b]) /= 1) $
        fail $ "Expected one success and one failure, got " ++ show [a,b]
    -- check it succeeds after the lock has been held
    build []

shake-0.16.4/src/Test/Existence.hs

module Test.Existence(main) where

import Development.Shake
    (getDirectoryFilesIO)
import Development.Shake.Internal.FileInfo (getFileInfo)
import Development.Shake.Internal.FileName (fileNameFromString)
import System.Directory (getCurrentDirectory)
import System.FilePath ((</>))

main :: IO () -> IO ()
main _ = do
    cwd <- getCurrentDirectory
    someFiles <- getDirectoryFilesIO cwd ["*"]
    let someFile = head someFiles
    assertIsJust . getFileInfo $ fileNameFromString someFile
    let fileThatCantExist = someFile </> "fileThatCantExist"
    assertIsNothing . getFileInfo $ fileNameFromString fileThatCantExist

assertIsJust :: IO (Maybe a) -> IO ()
assertIsJust action = do
    Just _ <- action
    return ()

assertIsNothing :: IO (Maybe a) -> IO ()
assertIsNothing action = do
    Nothing <- action
    return ()

shake-0.16.4/src/Test/Errors.hs

{-# LANGUAGE TypeFamilies, GeneralizedNewtypeDeriving, DeriveDataTypeable #-}
module Test.Errors(main) where

import Development.Shake
import Development.Shake.Classes
import Development.Shake.FilePath
import Test.Type
import Data.List
import Control.Monad
import Control.Concurrent
import General.GetOpt
import General.Extra
import Control.Exception.Extra
import System.Directory as IO
import System.Time.Extra
import qualified System.IO.Extra as IO
import Data.Functor
import Prelude

data Args = Die deriving (Eq,Enum,Bounded,Show)

newtype BadBinary = BadBinary String deriving (NFData,Show,Eq,Hashable,Typeable)
type instance RuleResult BadBinary = BadBinary

instance Binary BadBinary where
    put (BadBinary x) = put x
    get = do x <- get; if x == "bad" then error "get: BadBinary \"bad\"" else return $ BadBinary x

main = shakeTest test optionsEnum $ \args -> do
    "norule" %> \_ -> need ["norule_isavailable"]
    "failcreate" %> \_ -> return ()
    ["failcreates", "failcreates2"] &%> \_ -> writeFile' "failcreates" ""
    "recursive_" %> \out -> need ["intermediate_"]
    "intermediate_" %> \out -> need ["recursive_"]
    "rec1" %> \out -> need ["rec2"]
    "rec2" %> \out -> need ["rec1"]
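The `recursive_`/`intermediate_` and `rec1`/`rec2` rules above deliberately depend on each other so the test can assert that the build reports a "recursive" error instead of looping. As an illustration only (this is not Shake's implementation), such cycles can be found by a depth-first search that threads the path walked so far; `findCycle` is a hypothetical helper:

```haskell
import qualified Data.Map as Map

-- Hypothetical helper: given a dependency map, return the first cycle
-- reachable from the starting key, as the path that closes the loop.
findCycle :: Map.Map String [String] -> String -> Maybe [String]
findCycle deps = go []
  where
    go seen key
        | key `elem` seen = Just $ reverse $ key : seen  -- closed the loop
        | otherwise = firstJust [go (key : seen) d | d <- Map.findWithDefault [] key deps]
    firstJust xs = case [x | Just x <- xs] of
        x:_ -> Just x
        []  -> Nothing

main :: IO ()
main = print $ findCycle (Map.fromList [("rec1",["rec2"]),("rec2",["rec1"])]) "rec1"
-- prints Just ["rec1","rec2","rec1"]
```

A real build system would detect this at run time, since the dependency graph is only discovered as rules execute; the static search here is just the shape of the check.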
"systemcmd" %> \_ -> cmd "random_missing_command" "stack1" %> \_ -> need ["stack2"] "stack2" %> \_ -> need ["stack3"] "stack3" %> \_ -> error "crash" "staunch1" %> \out -> do liftIO $ sleep 0.1 writeFile' out "test" "staunch2" %> \_ -> error "crash" let catcher out op = out %> \out -> do writeFile' out "0" op $ do src <- IO.readFile' out; writeFile out $ show (read src + 1 :: Int) catcher "finally1" $ actionFinally $ fail "die" catcher "finally2" $ actionFinally $ return () catcher "finally3" $ actionFinally $ liftIO $ sleep 10 catcher "finally4" $ actionFinally $ need ["wait"] "wait" ~> do liftIO $ sleep 10 catcher "exception1" $ actionOnException $ fail "die" catcher "exception2" $ actionOnException $ return () res <- newResource "resource_name" 1 "resource" %> \out -> withResource res 1 $ need ["resource-dep"] "overlap.txt" %> \out -> writeFile' out "overlap.txt" "overlap.t*" %> \out -> writeFile' out "overlap.t*" "overlap.*" %> \out -> writeFile' out "overlap.*" "chain.2" %> \out -> do src <- readFile' "chain.1" if src == "err" then error "err_chain" else writeFileChanged out src "chain.3" %> \out -> copyFile' "chain.2" out "tempfile" %> \out -> do file <- withTempFile $ \file -> do liftIO $ assertExists file return file liftIO $ assertMissing file withTempFile $ \file -> do liftIO $ assertExists file writeFile' out file fail "tempfile-died" "tempdir" %> \out -> do file <- withTempDir $ \dir -> do let file = dir "foo.txt" liftIO $ writeFile (dir "foo.txt") "" -- will throw if the directory does not exist writeFile' out "" return file liftIO $ assertMissing file phony "fail1" $ fail "die1" phony "fail2" $ fail "die2" when (Die `elem` args) $ action $ error "death error" "fresh_dir" %> \out -> liftIO $ createDirectoryRecursive out "need_dir" %> \out -> do liftIO $ createDirectoryRecursive "existing_dir" need ["existing_dir"] writeFile' out "" "persist_failure.1" %> \out -> do liftIO $ appendFile "persist_failure.log" "[pre]" need ["persist_failure.2"] liftIO $ 
appendFile "persist_failure.log" "[post]" writeFile' out "" "persist_failure.2" %> \out -> do src <- readFile' "persist_failure.3" liftIO $ print ("persist_failure.3", src) if src == "die" then do liftIO $ appendFile "persist_failure.log" "[err]" fail "die" else writeFileChanged out src "fast_failure" %> \out -> do liftIO $ sleep 0.1 fail "die" "slow_success" %> \out -> do liftIO $ sleep 20 writeFile' out "" addOracle $ \(BadBinary x) -> return $ BadBinary $ 'b':x "badinput" %> \out -> do askOracle $ BadBinary "bad" liftIO $ appendFile out "x" "badoutput" %> \out -> do askOracle $ BadBinary "ad" liftIO $ appendFile out "x" "badnone" %> \out -> do alwaysRerun liftIO $ appendFile out "x" -- not tested by default since only causes an error when idle GC is turned on phony "block" $ liftIO $ putStrLn $ let x = x in x test build = do let crash args parts = assertException parts (build $ "--quiet" : args) build ["clean","--sleep"] writeFile "chain.1" "x" build ["chain.3","--sleep"] writeFile "chain.1" "err" crash ["chain.3"] ["err_chain"] crash ["norule"] ["norule_isavailable"] crash ["failcreate"] ["failcreate"] crash ["failcreates"] ["failcreates"] crash ["recursive_"] ["recursive_","intermediate_","recursive"] crash ["rec1","rec2"] ["rec1","rec2","recursive"] crash ["systemcmd"] ["systemcmd","random_missing_command"] crash ["stack1"] ["stack1","stack2","stack3","crash"] b <- IO.doesFileExist "staunch1" when b $ removeFile "staunch1" crash ["staunch1","staunch2","-j2"] ["crash"] assertBoolIO (not <$> IO.doesFileExist "staunch1") "File should not exist, should have crashed first" crash ["staunch1","staunch2","-j2","--keep-going","--silent"] ["crash"] assertBoolIO (IO.doesFileExist "staunch1") "File should exist, staunch should have let it be created" crash ["finally1"] ["die"] assertContents "finally1" "1" build ["finally2"] assertContents "finally2" "1" crash ["exception1"] ["die"] assertContents "exception1" "1" build ["exception2"] assertContents "exception2" "0" 
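The `catcher` assertions above pin down the contract those rules rely on: `actionFinally` runs its clean-up on success and failure alike (so both `finally1` and `finally2` end at "1"), while `actionOnException` fires only on failure (so `exception2` stays at "0"). The analogous plain `Control.Exception` combinators behave the same way, sketched here in ordinary IO:

```haskell
import Control.Exception
import Data.IORef

-- Count how often the clean-up action fires, mirroring the "0"/"1"
-- file contents the catcher tests assert on.
main :: IO ()
main = do
    ref <- newIORef (0 :: Int)
    let bump = modifyIORef ref (+1)
    return () `finally` bump                                   -- success: handler runs
    _ <- try (throwIO (userError "die") `finally` bump)
            :: IO (Either IOException ())                      -- failure: handler runs
    return () `onException` bump                               -- success: handler skipped
    print =<< readIORef ref                                    -- prints 2
```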
forM_ ["finally3","finally4"] $ \name -> do t <- forkIO $ ignore $ build [name,"--exception"] retry 10 $ sleep 0.1 >> assertContents name "0" throwTo t (IndexOutOfBounds "test") retry 10 $ sleep 0.1 >> assertContents name "1" crash ["resource"] ["cannot currently call apply","withResource","resource_name"] build ["overlap.foo"] assertContents "overlap.foo" "overlap.*" build ["overlap.txt"] assertContents "overlap.txt" "overlap.txt" crash ["overlap.txx"] ["key matches multiple rules","2","overlap.txx"] crash ["tempfile"] ["tempfile-died"] src <- readFile "tempfile" assertMissing src build ["tempdir"] crash ["--die"] ["Shake","action","death error"] putStrLn "## BUILD errors" (out,_) <- IO.captureOutput $ build [] assertBool ("nothing to do" `isInfixOf` out) $ "Expected 'nothing to do', but got: " ++ out putStrLn "## BUILD errors fail1 fail2 -k -j2" (out,_) <- IO.captureOutput $ try_ $ build ["fail1","fail2","-k","-j2",""] assertBool ("die1" `isInfixOf` out && "die2" `isInfixOf` out) $ "Expected 'die1' and 'die2', but got: " ++ out crash ["fresh_dir"] ["expected a file, got a directory","fresh_dir"] crash ["need_dir"] ["expected a file, got a directory","existing_dir"] -- check errors don't persist to the database, #428 writeFile "persist_failure.log" "" writeFile "persist_failure.3" "test" build ["persist_failure.1","--sleep"] writeFile "persist_failure.3" "die" crash ["persist_failure.1","--sleep"] [] assertContents "persist_failure.log" "[pre][post][err][pre]" writeFile "persist_failure.3" "test" build ["persist_failure.1","--sleep"] assertContents "persist_failure.log" "[pre][post][err][pre]" writeFile "persist_failure.3" "more" build ["persist_failure.1"] assertContents "persist_failure.log" "[pre][post][err][pre][pre][post]" -- check a fast failure aborts a slow success (t, _) <- duration $ crash ["fast_failure","slow_success","-j2"] ["die"] assertBool (t < 10) $ "Took too long, expected < 10, got " ++ show t -- for exceptions on Key we die while reading the 
database, and restart from scratch build ["badinput"] build ["badinput","--silent"] assertContents "badinput" "xx" build ["badnone","--silent"] -- must be able to still run other rules assertContents "badnone" "x" -- for exceptions on Value we die while running the rule that requires it build ["badoutput"] crash ["badoutput"] ["badoutput","BadBinary"] build ["badnone"] -- must be able to still run other rules assertContents "badnone" "xx" shake-0.16.4/src/Test/Docs.hs0000644000000000000000000003546613261223301014024 0ustar0000000000000000{-# LANGUAGE PatternGuards, ViewPatterns #-} module Test.Docs(main) where import Development.Shake import Development.Shake.FilePath import System.Directory import Test.Type import Control.Monad import Data.Char import General.Extra import Data.List.Extra import Data.Maybe import System.Info import Data.Version.Extra -- Older versions of Haddock garbage the --@ markup and have ambiguity errors brokenHaddock = compilerVersion < makeVersion [8] main = shakeTest_ (unless brokenHaddock . noTest) $ do let index = "dist/doc/html/shake/index.html" let config = "dist/setup-config" want ["Success.txt"] let needSource = need =<< getDirectoryFiles "." (map (root ) ["src/Development/Shake.hs","src/Development/Shake//*.hs","src/Development/Ninja/*.hs","src/General//*.hs"]) config %> \_ -> do need $ map (root ) ["shake.cabal","Setup.hs"] -- Make Cabal and Stack play nicely path <- getEnv "GHC_PACKAGE_PATH" liftIO $ createDirectoryRecursive "dist" dist <- liftIO $ canonicalizePath "dist" -- make sure it works even if we cwd cmd_ (RemEnv "GHC_PACKAGE_PATH") (Cwd root) "runhaskell Setup.hs configure" ["--builddir=" ++ dist,"--user"] -- package-db is very sensitive, see #267 -- note that the reverse ensures the behaviour is consistent between the flags and the env variable ["--package-db=" ++ x | x <- maybe [] (reverse . filter (`notElem` [".",""]) . 
splitSearchPath) path] -- Paths_shake is only created by "Setup build" (which we want to skip), and required by "Setup haddock", so we fake it copyFile' (root "src/Paths.hs") "dist/build/autogen/Paths_shake.hs" copyFile' (root "src/Paths.hs") "dist/build/shake/autogen/Paths_shake.hs" writeFile' "dist/build/autogen/cabal_macros.h" "" writeFile' "dist/build/shake/autogen/cabal_macros.h" "" trackAllow ["dist//*"] index %> \_ -> do need $ config : map (root ) ["shake.cabal","Setup.hs","README.md","CHANGES.txt"] needSource trackAllow ["dist//*"] dist <- liftIO $ canonicalizePath "dist" cmd (Cwd root) "runhaskell Setup.hs haddock" ["--builddir=" ++ dist] "Part_*.hs" %> \out -> do need [root "src/Test/Docs.hs"] -- so much of the generator is in this module let noR = filter (/= '\r') src <- if "_md" `isSuffixOf` takeBaseName out then fmap (findCodeMarkdown . lines . noR) $ readFile' $ root "docs/" ++ drop 5 (reverse (drop 3 $ reverse $ takeBaseName out)) ++ ".md" else fmap (findCodeHaddock . noR) $ readFile' $ "dist/doc/html/shake/" ++ replace "_" "-" (drop 5 $ takeBaseName out) ++ ".html" let (imports,rest) = partition ("import " `isPrefixOf`) $ showCode src writeFileChanged out $ unlines $ ["{-# LANGUAGE DeriveDataTypeable, RankNTypes, ExtendedDefaultRules, GeneralizedNewtypeDeriving #-}" ,"{-# LANGUAGE NoMonomorphismRestriction, ScopedTypeVariables, ConstraintKinds, FlexibleContexts, TypeFamilies #-}" ,"{-# OPTIONS_GHC -w #-}" ,"module " ++ takeBaseName out ++ "() where" ,"import Control.Applicative" ,"import Control.Concurrent" ,"import Control.Exception" ,"import Control.Monad" ,"import Data.ByteString(ByteString)" ,"import Data.Char" ,"import Data.Data" ,"import Data.Dynamic" ,"import Data.List.Extra" ,"import System.Time.Extra" ,"import Data.Maybe" ,"import Data.Monoid" ,"import Development.Shake hiding ((*>),trackAllow)" ,"import Development.Shake.Classes" ,"import Development.Shake.Rule hiding (trackAllow)" ,"import Development.Shake.Util" ,"import 
        Development.Shake.FilePath"
        ,"import System.Console.GetOpt"
        ,"import System.Directory(setCurrentDirectory)"
        ,"import qualified System.Directory"
        ,"import System.Environment(lookupEnv, getEnvironment)"
        ,"import System.Process"
        ,"import System.Exit"
        ,"import Control.Applicative"
        ,"import Control.Monad.IO.Class"
        ,"import System.IO"] ++
        ["import " ++ replace "_" "." (drop 5 $ takeBaseName out) | not $ "_md.hs" `isSuffixOf` out] ++
        imports ++
        ["(==>) :: Bool -> Bool -> Bool"
        ,"(==>) = undefined"
        ,"(<==) = ()"
        ,"infix 1 ==>"
        ,"infix 0 ==="
        ,"(===) :: a -> a -> b"
        ,"(===) = undefined"
        ,"forAll f = f undefined"
        ,"remaining = 1.1"
        ,"done = 1.1"
        ,"time_elapsed = 1.1"
        ,"old = \"\""
        ,"new = \"\""
        ,"myfile = \"\""
        ,"inputs = [\"\"]"
        ,"files = [\"\"]"
        ,"input = \"\""
        ,"output = \"\""
        ,"opts = shakeOptions"
        ,"result = undefined :: IO (Maybe (Rules ()))"
        ,"launchMissiles = undefined :: Bool -> IO ()"
        ,"myVariable = ()"
        ,"instance Eq (OptDescr a)"
        ,"(foo,bar,baz) = undefined"
        ,"(p1,p2) = (0.0, 0.0)"
        ,"(r1,r2) = (return () :: Rules(), return () :: Rules())"
        ,"xs = []"
        ,"ys = []"
        ,"os = [\"file.o\"]"
        ,"out = \"\""
        ,"str1 = \"\""
        ,"str2 = \"\""
        ,"def = undefined"
        ,"var = undefined"
        ,"str = \"\""] ++
        rest

    "Files.lst" %> \out -> do
        need [root </> "src/Test/Docs.hs"] -- so much of the generator is in this module
        need [index]
        filesHs <- getDirectoryFiles "dist/doc/html/shake" ["Development-*.html"]
        filesMd <- getDirectoryFiles (root </> "docs") ["*.md"]
        writeFileChanged out $ unlines $
            ["Part_" ++ replace "-" "_" (takeBaseName x) | x <- filesHs, not $ any (`isSuffixOf` x) ["-Classes.html", "-FilePath.html"]] ++
            ["Part_" ++ takeBaseName x ++ "_md" | x <- filesMd, takeBaseName x `notElem` ["Developing","Model"]]

    let needModules = do mods <- readFileLines "Files.lst"; need [m <.> "hs" | m <- mods]; return mods

    "Main.hs" %> \out -> do
        mods <- needModules
        writeFileLines out $ ["module Main(main) where"] ++ ["import " ++ m | m <- mods] ++ ["main = return ()"]

    "Success.txt" %> \out -> do
        putNormal . ("Checking documentation for:\n" ++) =<< readFile' "Files.lst"
        needModules
        need ["Main.hs"]
        trackAllow ["dist//*"]
        needSource
        cmd_ "ghc -fno-code -ignore-package=hashmap " ["-idist/build/autogen","-i" ++ root </> "src","Main.hs"]
        writeFile' out ""

---------------------------------------------------------------------
-- FIND THE CODE

newtype Code = Code [String] deriving (Show,Eq,Ord)

findCodeHaddock :: String -> [Code]
findCodeHaddock src =
    [ Code $ unindent $ lines $ innerText x
    | tag <- ["code","pre"]
    , x <- insideTag tag src
    , let bad = nubOrd (insideTag "em" x) \\ italics
    , if null bad then True else error $ "Bad italics, " ++ show bad]

findCodeMarkdown :: [String] -> [Code]
findCodeMarkdown (x:xs) | indented x && not (isBlank x) =
    let (a,b) = span (\x -> indented x || isBlank x) (x:xs)
    in Code (dropWhileEnd isBlank $ unindent a) : findCodeMarkdown b
    where indented x = length (takeWhile isSpace x) >= 4
findCodeMarkdown (x:xs) = map (Code . return) (evens $ splitOn "`" x) ++ findCodeMarkdown xs
    where evens (x:y:xs) = y : evens xs
          evens _ = []
findCodeMarkdown [] = []

---------------------------------------------------------------------
-- RENDER THE CODE

showCode :: [Code] -> [String]
showCode = concat . zipWith f [1..] . nubOrd
    where
        f i (Code x)
            | "#" `isPrefixOf` concat x = []
            | all whitelist x = []
            | otherwise = showStmt i $ filter (not . isBlank . dropComment) $ fixCmd $ map undefDots x

fixCmd :: [String] -> [String]
fixCmd xs
    | all ("cmd_ " `isPrefixOf`) xs = xs ++ ["return () :: IO () "]
    | otherwise = map (replace "Stdout out" "Stdout (out :: String)" . replace "Stderr err" "Stderr (err :: String)") xs

-- | Replace ... with undefined (don't use undefined with cmd; two ...'s should become one replacement)
undefDots :: String -> String
undefDots x
    | Just x <- stripSuffix "..." x, Just (x,_) <- stripInfix "..." x = x ++ new
    | otherwise = replace "..." new x
    where new = if words x `disjoint` ["cmd","cmd_","Development.Shake.cmd","Development.Shake.cmd_"] then "undefined" else "[\"\"]"

showStmt :: Int -> [String] -> [String]
showStmt i [] = []
showStmt i xs | isDecl $ unlines xs = map f xs
    where f x = if fst (word1 x) `elem` dupes then "_" ++ show i ++ "_" ++ x else x
showStmt i (x:xs) | fst (word1 x) `elem` types = ["type Code_" ++ show i ++ " = " ++ x]
showStmt i [x] | length (words x) <= 2 = ["code_" ++ show i ++ " = (" ++ x ++ ")"] -- deal with operators and sections
showStmt i xs | all isPredicate xs, length xs > 1 =
    zipWith (\j x -> "code_" ++ show i ++ "_" ++ show j ++ " = " ++ x) [1..] xs
showStmt i xs = ("code_" ++ show i ++ " = do") : map (" " ++) xs ++ [" undefined" | isBindStmt $ last xs]

isPredicate :: String -> Bool
isPredicate x = not $ disjoint (words x) ["==","?=="]

isBindStmt :: String -> Bool
isBindStmt x = "let " `isPrefixOf` x || " <- " `isInfixOf` x

isDecl :: String -> Bool
isDecl x | fst (word1 x) `elem` ["import","infix","instance","newtype"] = True
isDecl (words -> name:"::":_) | all isAlphaNum name = True -- foo :: Type Signature
isDecl x | "=" `elem` takeWhile (`notElem` ["let","where"]) (words $ takeWhile (/= '{') x) = True -- foo arg1 arg2 = an implementation
isDecl _ = False

---------------------------------------------------------------------
-- TEXT MANIPULATION

-- | Is a string empty or whitespace
isBlank :: String -> Bool
isBlank = all isSpace

-- | If all lines are indented by at least n spaces, then trim n spaces from each line
unindent :: [String] -> [String]
unindent xs = map (drop n) xs
    where n = minimum $ 1000 : map (length . takeWhile (== ' ')) (filter (not . isBlank) xs)

-- | Remove line comments from the end of lines
dropComment :: String -> String
dropComment = fst . breakOn "--"

-- | Find all pieces of text inside a given tag
insideTag :: String -> String -> [String]
insideTag tag = map (fst . breakOn ("</" ++ tag ++ ">")) . drop 1 . splitOn ("<" ++ tag ++ ">")

-- | Given some HTML, find the raw text
innerText :: String -> String
innerText ('<':xs) = innerText $ drop 1 $ dropWhile (/= '>') xs
innerText ('&':xs)
    | Just xs <- stripPrefix "quot;" xs = '\"' : innerText xs
    | Just xs <- stripPrefix "lt;" xs = '<' : innerText xs
    | Just xs <- stripPrefix "gt;" xs = '>' : innerText xs
    | Just xs <- stripPrefix "amp;" xs = '&' : innerText xs
innerText (x:xs) = x : innerText xs
innerText [] = []

---------------------------------------------------------------------
-- DATA SECTION

-- | Only the following identifiers can appear in italic code blocks in Haddock
--   (otherwise it's a common markup mistake)
italics :: [String]
italics = words "command-name file-name N"

-- | Identifiers that indicate the fragment is a type
types :: [String]
types = words $
    "MVar IO String FilePath Maybe [String] Char ExitCode Change " ++
    "Action Resource Rebuild FilePattern Development.Shake.FilePattern " ++
    "Lint Verbosity Rules CmdOption Int Double " ++
    "NFData Binary Hashable Eq Typeable Show Applicative " ++
    "CmdResult ByteString ProcessHandle Rule Monad Monoid Data TypeRep " ++
    "BuiltinRun BuiltinLint"

-- | Duplicated identifiers which require renaming
dupes :: [String]
dupes = words "main progressSimple rules"

isFilePath :: String -> Bool
isFilePath x = "C:\\" `isPrefixOf` x || (all validChar x && ("foo/" `isPrefixOf` x || takeExtension x `elem` exts))
    where
        validChar x = isAlphaNum x || x `elem` "_./*"
        exts = words $
            ".txt .hi .hs .o .exe .tar .cpp .cfg .dep .out .deps .m .h .c .html .zip " ++
            ".js .json .trace .database .src .sh .bat .ninja .rot13 .version .digits .prof .md"

isCmdFlag :: String -> Bool
isCmdFlag "+RTS" = True
isCmdFlag x = length a `elem` [1,2] && all (\x -> isAlphaNum x || x `elem` "-=/_[]") b
    where (a,b) = span (== '-') x

isCmdFlags :: String -> Bool
isCmdFlags = all (\x -> let y = fromMaybe x $ stripSuffix "," x in isCmdFlag y || isArg y) . words
    where isArg = all (\x -> isUpper x || x == '=')

isEnvVar :: String -> Bool
isEnvVar x | Just x <- stripPrefix "$" x = all validChar x
           | Just x <- stripPrefix "%" x, Just x <- stripSuffix "%" x = all validChar x
           | otherwise = False
    where validChar x = isAlpha x || x == '_'

isProgram :: String -> Bool
isProgram (words -> x:xs) = x `elem` programs && all (\x -> isCmdFlag x || isFilePath x || all isAlpha x || x == "&&") xs
    where programs = words "excel gcc cl make ghc ghci cabal distcc build tar git fsatrace ninja touch pwd runhaskell rot13 main shake stack rm cat sed sh apt-get build-multiple"

-- | Should a fragment be whitelisted and not checked
whitelist :: String -> Bool
whitelist x | null x || isFilePath x || isCmdFlags x || isEnvVar x || isProgram x = True
whitelist x | elem x $ words $
    "newtype do a q m c x value key os contents clean _make " ++
    ".. /. // \\ //* dir/*/* dir " ++
    "ConstraintKinds TemplateHaskell GeneralizedNewtypeDeriving DeriveDataTypeable TypeFamilies SetConsoleTitle " ++
    "Data.List System.Directory Development.Shake.FilePath run " ++
    "NoProgress Error src about://tracing " ++
    ".make/i586-linux-gcc/output build " ++
    "/usr/special /usr/special/userbinary " ++
    "Hidden extension xterm main opts result flagValues argValues " ++
    "HEADERS_DIR /path/to/dir CFLAGS let linkFlags temp code out err " ++
    "_shake _shake/build manual chrome://tracing/ compdb " ++
    "docs/manual foo.* _build _build/run depfile 0.000s " ++
    "@ndm_haskell file-name .PHONY filepath trim base stack extra #include " ++
    "*> BuiltinRun BuiltinLint RuleResult"
    = True
whitelist x = x `elem`
    ["[Foo.hi, Foo.o]"
    ,"shake-progress"
    ,"type instance"
    ,"1m25s (15%)"
    ,"3m12s (82%)"
    ,"getPkgVersion $ GhcPkgVersion \"shake\""
    ,"ghc --make MyBuildSystem -threaded -rtsopts \"-with-rtsopts=-I0 -qg -qb\""
    ,"# command-name (for file-name)"
    ,"build rules"
    ,"actions"
    ,"shakeFiles=\"_build\""
    ,"#include \""
    ,"pattern %> actions = (pattern ?==) ?> actions" -- because it overlaps
    ,"buildDir = \"_build\""
    ,"#!/bin/sh"
    ,"shake-build-system"
    ,"\"_build\" x -<.> \"o\""
    ,"[item1,item2,item2]"
    ,"$(LitE . StringL . loc_filename <$> location)"
    ,"-d[ FILE], --debug[=FILE]"
    ,"-r[ FILE], --report[=FILE], --profile[=FILE]"
    ]

shake-0.16.4/src/Test/Directory.hs

module Test.Directory(main) where

import Development.Shake
import Development.Shake.FilePath
import Test.Type
import Data.List
import Data.Function
import Control.Monad
import General.Extra
import System.Directory(getCurrentDirectory, createDirectory)
import qualified System.Directory as IO
import qualified System.IO.Extra as IO

-- Use escape characters, _o=* _l=/ __= 
readEsc ('_':'o':xs) = '*' : readEsc xs
readEsc ('_':'l':xs) = '/' : readEsc xs
readEsc ('_':'_':xs) = ' ' : readEsc xs
readEsc (x:xs) = x : readEsc xs
readEsc [] = []

showEsc = concatMap f
    where f '*' = "_o"
          f '/' = "_l"
          f ' ' = "__"
          f x = [x]

main = shakeTest_ test $ do
    "*.contents" %> \out -> writeFileLines out =<< getDirectoryContents (readEsc $ dropExtension out)
    "*.dirs" %> \out -> writeFileLines out =<< getDirectoryDirs (readEsc $ dropExtension out)
    "*.files" %> \out -> do
        let pats = readEsc $ dropExtension out
        let (x:xs) = ["" | " " `isPrefixOf` pats] ++ words pats
        writeFileLines out . map toStandard =<< getDirectoryFiles x xs
    "*.exist" %> \out -> do
        let xs = words $ readEsc $ dropExtension out
        fs <- mapM doesFileExist xs
        ds <- mapM doesDirectoryExist xs
        let bool x = if x then "1" else "0"
        writeFileLines out $ zipWith ((++) `on` bool) fs ds
    "dots" %> \out -> do
        cwd <- liftIO getCurrentDirectory
        b1 <- liftM2 (==) (getDirectoryContents ".") (getDirectoryContents "")
        b2 <- liftM2 (==) (getDirectoryDirs ".") (getDirectoryDirs "")
        b3 <- liftM2 (==) (getDirectoryFiles "." ["*.txt"]) (getDirectoryFiles "" ["*.txt"])
        b4 <- liftM2 (==) (getDirectoryFiles "." ["C.txt/*.txt"]) (getDirectoryFiles "" ["C.txt/*.txt"])
        b5 <- liftM2 (==) (getDirectoryFiles "." ["//*.txt"]) (getDirectoryFiles "" ["//*.txt"])
        writeFileLines out $ map show [b1,b2,b3,b4,b5]

test build = do
    let demand x ys = let f = showEsc x in do build [f]; assertContents f $ unlines $ words ys
    build ["clean"]
    demand " *.txt.files" ""
    demand " //*.txt.files" ""
    demand ".dirs" ""
    demand "A.txt B.txt C.txt.exist" "00 00 00"
    writeFile "A.txt" ""
    writeFile "B.txt" ""
    createDirectory "C.txt"
    writeFile "C.txt/D.txt" ""
    writeFile "C.txt/E.xtx" ""
    demand " *.txt.files" "A.txt B.txt"
    demand ".dirs" "C.txt"
    demand "A.txt B.txt C.txt.exist" "10 10 01"
    demand " //*.txt.files" "A.txt B.txt C.txt/D.txt"
    demand "C.txt *.txt.files" "D.txt"
    demand " *.txt //*.xtx.files" "A.txt B.txt C.txt/E.xtx"
    demand " C.txt/*.files" "C.txt/D.txt C.txt/E.xtx"
    demand " missing_dir/*.files" ""
    demand " missing_dir/bar/*.files" ""
    demand " //missing_dir/*.files" ""
    assertException ["missing_dir","does not exist"] $ build ["--quiet",showEsc "missing_dir *.files"]
    build ["dots","--no-lint"]
    assertContents "dots" $ unlines $ words "True True True True True"

    let removeTest pat del keep = IO.withTempDir $ \dir -> do
            forM_ (del ++ keep) $ \s -> do
                createDirectoryRecursive $ dir </> takeDirectory s
                unless (hasTrailingPathSeparator s) $ writeFile (dir </> s) ""
            removeFiles dir pat
            createDirectoryRecursive dir
            forM_ (map ((,) False) del ++ map ((,) True) keep) $ \(b,s) -> do
                b2 <- (if hasTrailingPathSeparator s then IO.doesDirectoryExist else IO.doesFileExist) $ dir </> s
                when (b /= b2) $ do
                    let f b = if b then "present" else "missing"
                    error $ "removeFiles mismatch: with pattern " ++ show pat ++ ", " ++ s ++
                            " should be " ++ f b ++ " but is " ++ f b2
    removeTest ["//bob"] ["test/bob","more/bob"] ["extra/obo"]
    removeTest ["bob"] ["bob/"] ["bar/"]
    removeTest ["*.hs"] ["test.hs"] ["extra/more.hs","new.txt"]
    removeTest ["baz"] ["baz"] ["foo","bar/bob"]
    removeTest ["baz"] ["baz/bob","baz/"] ["foo","bar/bob"]
    removeTest ["Foo//*"] ["Foo/bar","Foo/Quux/bar","Foo/Quux/"] []
    removeTest ["Foo//"] ["Foo/"] ["bar"]
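The `readEsc`/`showEsc` pair above lets glob patterns such as `C.txt *.txt` be embedded in file names by encoding `*`, `/` and space as two-character escapes. A standalone sketch of the same scheme (the functions are copied from the test above; only `main` is added for illustration):

```haskell
-- Escape scheme from Test.Directory: _o encodes '*', _l encodes '/', __ encodes ' '.
readEsc :: String -> String
readEsc ('_':'o':xs) = '*' : readEsc xs
readEsc ('_':'l':xs) = '/' : readEsc xs
readEsc ('_':'_':xs) = ' ' : readEsc xs
readEsc (x:xs) = x : readEsc xs
readEsc [] = []

showEsc :: String -> String
showEsc = concatMap f
    where f '*' = "_o"
          f '/' = "_l"
          f ' ' = "__"
          f x = [x]

main :: IO ()
main = do
    let pat = "C.txt *.txt"
    putStrLn (showEsc pat)                -- C.txt___o.txt
    print (readEsc (showEsc pat) == pat)  -- True: round trips on pattern characters
```

Note the escapes all start with `_`, so `readEsc . showEsc` is the identity for any pattern; a literal `_o` in unescaped input would be misread, which the tests avoid.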
    removeTest ["baz"] [] ["test.hs","bar/","foo/"]
    removeTest ["bob//*"] [] ["test/bob/"]
    removeTest ["//bob"] ["test/bob/"] ["test/"]
    removeTest ["//*.txt"] ["more/a.txt"] ["more/"]
    removeTest ["//*.txt"] ["more/a.txt/"] ["more/"]
    removeTest ["//*.txt"] ["more/a.txt/","more/b.txt"] ["more/"]
    removeTest ["//*.txt"] [] ["more/"]
    removeTest ["a//b"] ["a/c/b"] []
    removeFiles "non-existing-directory" ["*"]

shake-0.16.4/src/Test/Digest.hs

module Test.Digest(main) where

import Control.Monad
import Development.Shake
import Test.Type

main = shakeTest_ test $ do
    want ["Out.txt","Out2.txt"]

    "Out.txt" %> \out -> do
        txt <- readFile' "In.txt"
        liftIO $ appendFile out txt

    ["Out1.txt","Out2.txt"] &%> \[out1,out2] -> do
        txt <- readFile' "In.txt"
        liftIO $ appendFile out1 txt
        liftIO $ appendFile out2 txt

    ["Bug1.txt","Bug2.txt"] &%> \[out1,out2] -> do
        need ["Bug3.txt"]
        writeFile' out1 "X"
        writeFile' out2 "Y"

    "leaf" ~> return ()
    "node1.txt" %> \file -> do need ["leaf"]; writeFile' file "x"
    "node2.txt" %> \file -> do need ["node1.txt"]; liftIO $ appendFile file "x"

    ["rewrite1","rewrite2"] &%> \outs -> do
        alwaysRerun
        forM_ outs $ \out -> writeFile' out "rewrite"

test build = do
    let outs = ["Out.txt","Out1.txt","Out2.txt"]
    let writeOut x = forM_ outs $ \out -> writeFile out x
    let writeIn = writeFile "In.txt"
    let assertOut x = forM_ outs $ \out -> assertContents out x
    writeOut ""
    writeIn "X"
    build ["--sleep","--digest-and"]
    assertOut "X"

    -- should not involve a hash calculation (sadly no way to test that)
    build ["--sleep","--digest-and"]
    assertOut "X"

    writeIn "X"
    build ["--sleep","--digest-and"]
    assertOut "X"

    writeIn "X"
    build ["--sleep","--digest-or"]
    assertOut "XX"

    writeIn "X"
    build ["--sleep","--digest-and"]
    assertOut "XX"

    build ["--sleep","--digest-and"]
    writeOut "XX"
    build ["--sleep","--digest-and"]
    assertOut "XX"

    build ["--sleep","--digest-and"]
    writeOut "Y"
    build ["--sleep","--digest-and"]
    assertOut "YX"

    writeIn "X"
    build ["--sleep","--digest"]
    assertOut "YX"

    writeIn "Z"
    build ["--sleep","--digest-and-input"]
    assertOut "YXZ"

    build ["--sleep","--digest-and-input"]
    writeOut "YXZ"
    build ["--sleep","--digest-and-input"]
    assertOut "YXZZ"

    writeIn "Q"
    build ["--sleep","--digest-and-input"]
    assertOut "YXZZQ"

    writeIn "Q"
    build ["--sleep","--digest-and-input"]
    assertOut "YXZZQ"

    -- test for #218
    forM_ [("--digest",1),("--digest-and",1),("--digest-or",2),("--digest-and-input",2),("",2)] $ \(flag,count) -> do
        writeFile "node2.txt" "y"
        replicateM_ 2 $ build $ ["node2.txt","--sleep"] ++ [flag | flag /= ""]
        assertContents "node2.txt" $ 'y' : replicate count 'x'

    -- test for #296
    writeFile "Bug3.txt" "X"
    build ["--digest-and-input","Bug1.txt","--sleep"]
    writeFile "Bug3.txt" "Y"
    build ["--digest-and-input","Bug1.txt","--lint"]

    -- test for #427
    build ["rewrite1","--digest-and"]
    build ["rewrite1","--digest-and","--lint","--sleep"]

shake-0.16.4/src/Test/Config.hs

module Test.Config(main) where

import Development.Shake
import Development.Shake.FilePath
import Development.Shake.Config
import Test.Type
import Data.Char
import qualified Data.HashMap.Strict as Map
import Data.Maybe

main = shakeTest_ test $ do
    want ["hsflags.var","cflags.var","none.var","keys"]
    usingConfigFile "config"
    "*.var" %> \out -> do
        cfg <- getConfig $ map toUpper $ takeBaseName out
        liftIO $ appendFile (out -<.> "times") "X"
        writeFile' out $ fromMaybe "" cfg
    "keys" %> \out -> do
        liftIO $ appendFile "keys.times" "X"
        liftIO . writeFile out . unwords =<< getConfigKeys

test build = do
    build ["clean"]
    writeFile "config" $ unlines
        ["HEADERS_DIR = /path/to/dir"
        ,"CFLAGS = -O2 -I${HEADERS_DIR} -g"
        ,"HSFLAGS = -O2"]
    build []
    assertContents "cflags.var" "-O2 -I/path/to/dir -g"
    assertContents "hsflags.var" "-O2"
    assertContents "none.var" ""
    assertContents "keys" "CFLAGS HEADERS_DIR HSFLAGS"
    appendFile "config" $ unlines
        ["CFLAGS = $CFLAGS -w"]
    build []
    assertContents "cflags.var" "-O2 -I/path/to/dir -g -w"
    assertContents "hsflags.var" "-O2"
    assertContents "cflags.times" "XX"
    assertContents "hsflags.times" "X"
    assertContents "keys.times" "X"

    -- Test readConfigFileWithEnv
    writeFile "config" $ unlines
        ["HEADERS_DIR = ${SOURCE_DIR}/path/to/dir"
        ,"CFLAGS = -O2 -I${HEADERS_DIR} -g"]
    vars <- readConfigFileWithEnv [("SOURCE_DIR", "/path/to/src")] "config"
    assertBool (Map.lookup "HEADERS_DIR" vars == Just "/path/to/src/path/to/dir") $
        "readConfigFileWithEnv:" ++
        " Expected: " ++ show (Just "/path/to/src/path/to/dir") ++
        " Got: " ++ show (Map.lookup "HEADERS_DIR" vars)
    assertBool (Map.lookup "CFLAGS" vars == Just "-O2 -I/path/to/src/path/to/dir -g") $
        "readConfigFileWithEnv:" ++
        " Expected: " ++ show (Just "-O2 -I/path/to/src/path/to/dir -g") ++
        " Got: " ++ show (Map.lookup "CFLAGS" vars)

shake-0.16.4/src/Test/Command.hs

{-# LANGUAGE Rank2Types, ScopedTypeVariables #-}

module Test.Command(main) where

import Control.Applicative
import Development.Shake
import Development.Shake.FilePath
import Control.Exception.Extra
import System.Time.Extra
import Control.Monad.Extra
import System.Directory
import Test.Type
import System.Exit
import System.Process
import Data.Tuple.Extra
import Data.List.Extra
import Control.Monad.IO.Class
import System.Info.Extra
import qualified Data.ByteString.Char8 as BS
import qualified Data.ByteString.Lazy.Char8 as LBS
import Prelude

main = shakeTest_ test $ do
    -- shake_helper must be in a subdirectory so we can test placing that subdir on the $PATH
    let helper = toNative $ "helper/shake_helper" <.> exe
    let name !> test = do
            want [name]
            name ~> do need ["helper/shake_helper" <.> exe]; test

    let helper_source = unlines
            ["import Control.Concurrent"
            ,"import Control.Monad"
            ,"import System.Directory"
            ,"import System.Environment"
            ,"import System.Exit"
            ,"import System.IO"
            ,"import qualified Data.ByteString.Lazy.Char8 as LBS"
            ,"main = do"
            ,"    args <- getArgs"
            ,"    forM_ args $ \\(a:rg) -> do"
            ,"        case a of"
            ,"            'o' -> putStrLn rg"
            ,"            'e' -> hPutStrLn stderr rg"
            ,"            'x' -> exitFailure"
            ,"            'c' -> putStrLn =<< getCurrentDirectory"
            ,"            'v' -> putStrLn =<< getEnv rg"
            ,"            'w' -> threadDelay $ floor $ 1000000 * (read rg :: Double)"
            ,"            'r' -> LBS.putStr $ LBS.replicate (read rg) 'x'"
            ,"            'i' -> putStr =<< getContents"
            ,"        hFlush stdout"
            ,"        hFlush stderr"
            ]

    "shake_helper.hs" %> \out -> do
        need ["../../src/Test/Command.hs"]
        writeFileChanged out helper_source

    ["helper/shake_helper" <.> exe, "shake_helper.o", "shake_helper.hi"] &%> \_ -> do
        need ["shake_helper.hs"]
        cmd "ghc --make" "shake_helper.hs -o helper/shake_helper"

    "capture" !> do
        (Stderr err, Stdout out) <- cmd helper ["ostuff goes here","eother stuff here"]
        liftIO $ out === "stuff goes here\n"
        liftIO $ err === "other stuff here\n"
        liftIO $ waits $ \w -> do
            Stdouterr out <- cmd helper Shell ["o1",w,"e2",w,"o3"]
            out === "1\n2\n3\n"

    "failure" !> do
        (Exit e, Stdout (), Stderr ()) <- cmd helper "oo ee x"
        when (e == ExitSuccess) $ error "/= ExitSuccess"
        liftIO $ assertException ["BAD"] $ cmd helper "oo eBAD x" (EchoStdout False) (EchoStderr False)
        liftIO $ assertException ["MORE"] $ cmd helper "oMORE eBAD x" (WithStdout True) (WithStderr False) (EchoStdout False) (EchoStderr False)

    "cwd" !> do
        -- FIXME: Linux searches the Cwd argument for the file, Windows searches getCurrentDirectory
        helper <- liftIO $ canonicalizePath $ "helper/shake_helper" <.> exe
        Stdout out <- cmd (Cwd "helper") helper "c"
        let norm = fmap dropTrailingPathSeparator . canonicalizePath . trim
        liftIO $ join $ liftM2 (===) (norm out) (norm "helper")

    let checkTimeout act = do
            offset <- liftIO offsetTime
            act
            t <- liftIO offset
            putNormal $ "Timed out in " ++ showDuration t
            when (t < 2 || t > 8) $ error $ "failed to timeout, took " ++ show t

    "timeout1" !> checkTimeout (do
        Exit exit <- cmd (Timeout 2) helper "w20"
        liftIO $ assertBool (exit /= ExitSuccess) "exit was ExitSuccess")

    "timeout2" !> do
        checkTimeout $ liftIO $ timeout 2 $ cmd_ helper "w20"

    when False $ "timeout3" !> do
        checkTimeout $ liftIO $ timeout 2 $ cmd_ Shell helper "w20"

    "env" !> do
        -- use liftIO since it blows away PATH which makes lint-tracker stop working
        Stdout out <- liftIO $ cmd (Env [("FOO","HELLO SHAKE")]) Shell helper "vFOO"
        liftIO $ out === "HELLO SHAKE\n"
        Stdout out <- cmd (AddEnv "FOO" "GOODBYE SHAKE") Shell helper "vFOO"
        liftIO $ out === "GOODBYE SHAKE\n"

    "space" !> do
        Stdout out <- cmd helper ["oSPACE 1"]
        liftIO $ out === "SPACE 1\n"
        Stdout out <- cmd Shell helper "\"oSPACE 2\""
        liftIO $ out === "SPACE 2\n"
        whenM (liftIO hasTracker) $ do
            Stdout out <- cmd Shell AutoDeps helper "\"oSPACE 2\""
            liftIO $ out === "SPACE 2\n"
        (Stdout (), CmdLine x) <- cmd helper ["oSPACE 3","oDIRECT"]
        unless (" \"oSPACE 3\" oDIRECT" `isSuffixOf` replace "\'" "\"" x) $
            fail $ "Invalid CmdLine, " ++ x

    "path" !> do
        let path = AddPath [dropTrailingPathSeparator "helper"] []
        cmd_ "helper/shake_helper"
        cmd_ $ "helper/shake_helper" <.> exe
        cmd_ path Shell "shake_helper"
        cmd_ path "shake_helper"

    "file" !> do
        let file = "file.txt"
        cmd_ helper (FileStdout file) (FileStderr file) (EchoStdout False) (EchoStderr False) (WithStderr False) "ofoo ebar obaz"
        liftIO $ assertContents file "foo\nbar\nbaz\n"
        liftIO $ waits $ \w -> do
            Stderr err <- cmd helper (FileStdout file) (FileStderr file) ["ofoo",w,"ebar",w,"obaz"]
            err === "bar\n"
            assertContents file "foo\nbar\nbaz\n"

    "timer" !> do
        timer $ cmd helper

    "binary" !> do
        (Stdout str, Stdout bs) <- cmd BinaryPipes helper "ofoo"
        liftIO $ (===) (str, bs) $ second BS.pack $ dupe $ if isWindows then "foo\r\n" else "foo\n"
        (Stdout str, Stdout bs) <- cmd helper "ofoo"
        liftIO $ (str, bs) === ("foo\n", BS.pack $ if isWindows then "foo\r\n" else "foo\n")
        return ()

    "large" !> do
        (Stdout (_ :: String), CmdTime t1) <- cmd helper "r10000000"
        (Stdout (_ :: LBS.ByteString), CmdTime t2) <- cmd helper "r10000000"
        t3 <- withTempFile $ \file -> fromCmdTime <$> cmd helper "r10000000" (FileStdout file)
        liftIO $ putStrLn $ "Capturing 10Mb takes: " ++ intercalate ","
            [s ++ " = " ++ showDuration d | (s,d) <- [("String",t1),("ByteString",t2),("File",t3)]]

    "stdin" !> do
        withTempFile $ \file -> do
            liftIO $ writeFile file " "
            Stdout (x :: String) <- cmd helper "i" (Stdin "hello") (FileStdin file) (StdinBS $ LBS.pack "world")
            liftIO $ x === "hello world"

    "async" !> do
        let file = "async.txt"
        pid <- cmd helper (FileStdout file) "w2" "ohello"
        Nothing <- liftIO $ getProcessExitCode pid
        ExitSuccess <- liftIO $ waitForProcess pid
        liftIO $ assertContents file "hello\n"

test build = do
    -- reduce the overhead by running all the tests in parallel
    -- lint can make a big different to the command lines, so test with and without
    whenM hasTracker $ build ["-j4","--no-lint"]
    build ["-j4"]

timer :: (CmdResult r, MonadIO m) => (forall r . CmdResult r => m r) -> m r
timer act = do
    (CmdTime t, CmdLine x, r) <- act
    liftIO $ putStrLn $ "Command " ++ x ++ " took " ++ show t ++ " seconds"
    return r

waits :: (String -> IO ()) -> IO ()
waits op = f 0
    where f w | w > 1 = op "w10"
              | otherwise = catch_ (op $ "w" ++ show w) $ const $ f $ w + 0.1

shake-0.16.4/src/Test/Cache.hs

module Test.Cache(main) where

import Development.Shake
import Development.Shake.FilePath
import Data.Char
import Test.Type

main = shakeTest_ test $ do
    vowels <- newCache $ \file -> do
        src <- readFile' file
        liftIO $ appendFile "trace.txt" "1"
        return $ length $ filter isDigit src
    "*.out*" %> \x ->
        writeFile' x . show =<< vowels (dropExtension x <.> "txt")

test build = do
    writeFile "trace.txt" ""
    writeFile "vowels.txt" "abc123a"
    build ["vowels.out1","vowels.out2","-j3","--sleep"]
    assertContents "trace.txt" "1"
    assertContents "vowels.out1" "3"
    assertContents "vowels.out2" "3"

    build ["vowels.out2","-j3"]
    assertContents "trace.txt" "1"
    assertContents "vowels.out1" "3"

    writeFile "vowels.txt" "12xyz34"
    build ["vowels.out2","-j3","--sleep"]
    assertContents "trace.txt" "11"
    assertContents "vowels.out2" "4"

    build ["vowels.out1","-j3","--sleep"]
    assertContents "trace.txt" "111"
    assertContents "vowels.out1" "4"

shake-0.16.4/src/Test/C.hs

module Test.C(main) where

import Development.Shake
import Development.Shake.FilePath
import Test.Type

main = shakeTest_ noTest $ do
    let src = root </> "src/Test/C"
    want ["Main.exe"]

    "Main.exe" %> \out -> do
        cs <- getDirectoryFiles src ["*.c"]
        let os = map (<.> "o") cs
        need os
        cmd "gcc -o" [out] os

    "*.c.o" %> \out -> do
        let c = src </> takeBaseName out
        need [c]
        headers <- cIncludes c
        need $ map ((</>) src . takeFileName) headers
        cmd "gcc -o" [out] "-c" [c]

cIncludes :: FilePath -> Action [FilePath]
cIncludes x = do
    Stdout stdout <- cmd "gcc" ["-MM",x]
    return $ drop 2 $ words stdout

shake-0.16.4/src/Test/Benchmark.hs

module Test.Benchmark(main) where

import General.GetOpt
import Development.Shake
import Test.Type
import Text.Read.Extra
import Development.Shake.FilePath

data Opts = Depth Int | Breadth Int

opts = [Option "" ["depth"  ] (ReqArg (fmap Depth . readEither) "INT") ""
       ,Option "" ["breadth"] (ReqArg (fmap Breadth . readEither) "INT") ""]

-- | Given a breadth and depth come up with a set of build files
main = shakeTest test opts $ \opts -> do
    let depth = last $ error "Missing --depth" : [x | Depth x <- opts]
    let breadth = last $ error "Missing --breadth" : [x | Breadth x <- opts]

    want ["0." ++ show i | i <- [1..breadth]]
    "*" %> \out -> do
        let d = read $ takeBaseName out
        need [show (d + 1) ++ "." ++ show i | d < depth, i <- [1..breadth]]
        writeFile' out ""

test build = do
    -- these help to test the stack limit
    build ["clean"]
    build ["--breadth=75","--depth=75"]
    build ["--breadth=75","--depth=75"]

shake-0.16.4/src/Test/Batch.hs

module Test.Batch(main) where

import Development.Shake
import Development.Shake.FilePath
import System.Directory
import General.Extra
import Test.Type
import Control.Monad

main = shakeTest test [] $ \opts -> do
    let inp x = x -<.> "in"
    file <- newResource "log.txt" 1
    batch 3 ("*.out" %>) (\out -> do need [inp out]; return out) $ \outs -> do
        liftIO $ assertBool (length outs <= 3) "length outs <= 3"
        withResource file 1 $ liftIO $ appendFile "log.txt" $ show (length outs) ++ "\n"
        putNormal $ "Building batch: " ++ unwords outs
        forM_ outs $ \out -> liftIO $ copyFile (inp out) out
    want [show i <.> "out" | i <- [1..6]]

    "ABn.txt" %> \out -> do
        xs <- needHasChanged ["An.txt", "Bn.txt"]
        writeFileLines out xs

    ["An", "Bn"] &?%> \outs -> do
        xs <- needHasChanged $ map (-<.> "in") outs
        os <- mapM resultHasChanged outs
        forM_ (zip outs os) $ \(out, o) ->
            when (o || (out -<.> "in" `elem` xs)) $ writeFile' out "1"

    "On" %> \out -> do
        xs <- needHasChanged ["An", "Bn"]
        o <- resultHasChanged out
        writeFileLines out $ xs ++ ["On" | o]

test build = do
    forM_ [1..6] $ \i -> writeFile (show i <.> "in") $ show i
    build ["--sleep","-j2"]
    assertBoolIO (do src <- readFile "log.txt"; return $ length (lines src) < 6) "some batching"
    writeFile "log.txt" ""
    writeFile "2.in" "22"
    writeFile "5.in" "55"
    build []
    assertContents "log.txt" "2\n"

    writeFile "An.txt" "1"
    writeFile "Bn.txt" "1"
    build ["ABn.txt", "--sleep"]
    assertContents "ABn.txt" "An.txt\nBn.txt\n"
    writeFile "An.txt" "1"
    build ["ABn.txt", "--sleep"]
    assertContents "ABn.txt" "An.txt\n"
    writeFile "Bn.txt" "1"
    build ["ABn.txt", "--sleep"]
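The `cIncludes` helper in Test.C above turns `gcc -MM` output into a header list simply by dropping the first two words (the `foo.o:` target and the source file itself). A pure standalone sketch of that parsing step, using an invented sample `-MM` line:

```haskell
-- Pure version of the parsing done by cIncludes in Test.C: `gcc -MM file.c`
-- prints "file.o: file.c hdr1.h hdr2.h ...", so dropping the first two
-- whitespace-separated words leaves the included headers.
cIncludesPure :: String -> [FilePath]
cIncludesPure = drop 2 . words

main :: IO ()
main = print (cIncludesPure "main.o: main.c constants.h square.h")
-- prints ["constants.h","square.h"]
```

Real `-MM` output for many headers wraps lines with backslash continuations, which `words` would keep as `"\\"` entries; the small test sources here stay on one line, so the simple parse suffices.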
    assertContents "ABn.txt" "Bn.txt\n"
    build ["ABn.txt", "--sleep"]
    assertContents "ABn.txt" "Bn.txt\n"
    writeFile "ABn.txt" "bogus"
    build ["ABn.txt", "--sleep"]
    assertContents "ABn.txt" ""

    forM_ [[],["--usepredicate"]] $ \args -> do
        writeFile "An.in" "1"
        writeFile "Bn.in" "1"
        removeFile_ "On"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "An\nBn\nOn\n"
        writeFile "An.in" "1"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "An\n"
        writeFile "Bn.in" "1"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "Bn\n"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "Bn\n"
        removeFile "An"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "An\n"
        removeFile "An"
        writeFile "Bn.in" "2"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "An\nBn\n"
        removeFile "On"
        build $ ["On", "--sleep"] ++ args
        assertContents "On" "On\n"

shake-0.16.4/src/Test/Basic.hs

module Test.Basic(main) where

import Development.Shake
import System.FilePath
import Test.Type
import System.Directory as IO
import Data.List
import Data.Maybe
import Control.Monad
import General.Extra
import Data.Functor
import Prelude

main = shakeTest_ test $ do
    "AB.txt" %> \out -> do
        need ["A.txt", "B.txt"]
        text1 <- readFile' "A.txt"
        text2 <- readFile' "B.txt"
        writeFile' out $ text1 ++ text2

    "twice.txt" %> \out -> do
        let src = "once.txt"
        need [src, src]
        copyFile' src out

    "once.txt" %> \out -> do
        src <- readFile' "zero.txt"
        writeFile' out src

    phonys $ \x -> if x /= "halfclean" then Nothing else Just $
        removeFilesAfter "dir" ["//*e.txt"]

    phony "cleaner" $
        removeFilesAfter "dir" ["//*"]

    phony "configure" $
        liftIO $ appendFile "configure" "1"

    phony "install" $ do
        need ["configure","once.txt"]
        liftIO $ appendFile "install" "1"

    phony "duplicate1" $ need ["duplicate2","duplicate3"]
    phony "duplicate2" $ need ["duplicate3"]
    phony "duplicate3" $ liftIO $ appendFile "duplicate" "1"

    phony "dummy" $ liftIO $ appendFile "dummy" "1"

    phony "threads" $ do
        x <- getShakeOptions
        writeFile' "threads.txt" $ show $ shakeThreads x

    phony ("slash" </> "platform") $ return ()
    phony "slash/forward" $ return ()

    "dummer.txt" %> \out -> do
        need ["dummy","dummy"]
        need ["dummy"]
        liftIO $ appendFile out "1"

    r <- newResource ".log file" 1
    let trace x = withResource r 1 $ liftIO $ appendFile ".log" x
    "*.par" %> \out -> do
        trace "["
        (if "unsafe" `isInfixOf` out then unsafeExtraThread else id) $ liftIO $ sleep 0.1
        trace "]"
        writeFile' out out

    "sep" </> "1.txt" %> \out -> writeFile' out ""
    "sep/2.txt" %> \out -> writeFile' out ""
    ["sep" </> "3.txt", "sep" </> "4.txt", "sep" </> "5.*", "sep/6.txt"] |%> \out -> writeFile' out ""
    ["sep" </> "7.txt"] |%> \out -> writeFile' out ""

    "ids/source" %> \out -> return ()
    "ids/out" %> \out -> do need =<< readFileLines "ids/source"; writeFile' out ""
    "ids/*" %> \out -> do alwaysRerun; trace (takeFileName out); writeFile' out $ takeFileName out

    "rerun" %> \out -> do
        alwaysRerun
        liftIO $ appendFile out "."

    phony "foo" $ liftIO $ createDirectoryRecursive "foo"

    phony "ordering2" $ liftIO $ appendFile "order.log" "X"
    phony "ordering" $ do
        liftIO $ appendFile "order.log" "Y"
        need ["ordering2"]

test build = do
    build ["clean"]
    writeFile "A.txt" "AAA"
    writeFile "B.txt" "BBB"
    build ["AB.txt","--sleep"]
    assertContents "AB.txt" "AAABBB"
    appendFile "A.txt" "aaa"
    build ["AB.txt"]
    assertContents "AB.txt" "AAAaaaBBB"
    removeFile "AB.txt"
    build ["AB.txt"]
    assertContents "AB.txt" "AAAaaaBBB"

    writeFile "zero.txt" "xxx"
    build ["twice.txt","--sleep"]
    assertContents "twice.txt" "xxx"
    writeFile "zero.txt" "yyy"
    build ["once.txt","--sleep"]
    assertContents "twice.txt" "xxx"
    assertContents "once.txt" "yyy"
    writeFile "zero.txt" "zzz"
    build ["once.txt","twice.txt","--sleep"]
    assertContents "twice.txt" "zzz"
    assertContents "once.txt" "zzz"
    removeFile "twice.txt"
    build ["twice.txt"]
    assertContents "twice.txt" "zzz"

    show shakeOptions === show shakeOptions

    createDirectoryRecursive "dir"
    writeFile "dir/ae.txt" ""
    writeFile "dir/ea.txt" ""
    build ["halfclean"]
    assertBoolIO (IO.doesDirectoryExist "dir") "Directory should exist, cleaner should not have removed it"
    build ["cleaner"]
    sleep 1 -- sometimes takes a while for the file system to notice
    assertBoolIO (not <$> IO.doesDirectoryExist "dir") "Directory should not exist, cleaner should have removed it"

    writeFile "zero.txt" ""
    writeFile "configure" ""
    writeFile "install" ""
    build ["configure"]
    build ["install"]
    build ["install"]
    assertContents "configure" "111"
    assertContents "install" "11"

    build ["dummy"]
    assertContents "dummy" "1"
    build ["dummy"]
    assertContents "dummy" "11"
    build ["dummy","dummy"]
    assertContents "dummy" "111"

    writeFile "dummer.txt" ""
    build ["dummer.txt"]
    assertContents "dummer.txt" "1"
    build ["dummer.txt"]
    assertContents "dummer.txt" "11"

    build ["1.par","2.par","-j1"]
    assertContents ".log" "[][]"
    writeFile ".log" ""
    build ["3.par","4.par","-j2"]
    assertContents ".log" "[[]]"
    writeFile ".log" ""
    processors <- getProcessorCount
    putStrLn $ "getProcessorCount returned " ++ show processors
    when (processors > 1) $ do
        build ["5.par","6.par","-j0"]
        assertContents ".log" "[[]]"
        writeFile ".log" ""
        build ["unsafe1.par","unsafe2.par","-j2"]
        assertContents ".log" "[[]]"

    build ["threads","-j3"]
    assertContents "threads.txt" "3"
    build ["threads","-j0"]
    assertContents "threads.txt" (show processors)

    writeFile "duplicate" ""
    build ["duplicate1","duplicate3"]
    assertContents "duplicate" "1"

    build $ concat [["sep/" ++ show i ++ ".txt", "sep" </> show i ++ ".txt"] | i <- [1..7]]
    build ["slash" </> "platform","slash" </> "forward"]
    build ["slash/platform","slash/forward"]

    createDirectoryRecursive "ids"
    writeFile "ids/source" "ids/a"
    build ["ids/out","--sleep"]
    writeFile ".log" ""
    writeFile "ids/source" "ids/b"
    build ["ids/out","-j4"]
    -- if you collapse depends to [Id] then this ends up asking for the stale 'a'
    assertContents ".log" "b"

    writeFile "rerun" ""
    build ["rerun"]
    assertContents "rerun" "."
    build ["rerun","rerun"]
    assertContents "rerun" ".."
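The `.log` assertions in the parallelism tests above rely on a bracket trace: each `*.par` rule appends `[` on entry and `]` on exit under a shared resource, so the maximum nesting depth of the trace equals the maximum number of rules running at once. A small standalone sketch (the `maxOverlap` helper is my own, written only to illustrate why `"[][]"` means serial and `"[[]]"` means overlapped):

```haskell
-- Maximum nesting depth of a '['/']' bracket trace: running depth goes up
-- on '[' and down on ']', and the peak is the maximum concurrency observed.
maxOverlap :: String -> Int
maxOverlap = maximum . scanl step 0
  where step d '[' = d + 1
        step d ']' = d - 1
        step d _   = d

main :: IO ()
main = do
    print (maxOverlap "[][]")  -- 1: the two rules ran one after the other (-j1)
    print (maxOverlap "[[]]")  -- 2: the two rules overlapped (-j2)
```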
    build ["foo"]
    build ["foo"]

    build [] -- should say "no want/action statements, nothing to do" (checked manually)

    -- #523, #524 - phony children should not run first
    writeFile "order.log" ""
    build ["ordering"]
    assertContents "order.log" "YX"
    build ["ordering"]
    assertContents "order.log" "YXYX"

shake-0.16.4/src/Test/Tup/root.cfg

hello.exe = hello.c newmath.a
include ../../src/Test/Tup/newmath/root.cfg

shake-0.16.4/src/Test/Tup/hello.c

#include <stdio.h>
#include "square.h"

int main(void)
{
    printf("Hi, everybody!\n");
    printf("Five squared is: %i\n", square(5));
    return 0;
}

shake-0.16.4/src/Test/Tup/newmath/square.h

int square(int x);

shake-0.16.4/src/Test/Tup/newmath/square.c

#include "square.h"

int square(int x)
{
    return x * x;
}

shake-0.16.4/src/Test/Tup/newmath/root.cfg

newmath.a = square.c

shake-0.16.4/src/Test/Tar/list.txt

src/Test/Tar.hs
src/Run.hs
src/Test/Tar/list.txt

shake-0.16.4/src/Test/Progress/self-zero-j2.prog

(1.020102,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 254, countUnknown = 4, countTodo = 148, timeSkipped = 0.0, timeBuilt = 0.7620761110447347, timeUnknown = 2.0004000980407e-3, timeTodo = (146.13152742385864,0)})
(3.7443745,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 258, countUnknown = 1, countTodo = 147, timeSkipped = 0.0, timeBuilt = 1.0521051175892353, timeUnknown = 0.0, timeTodo = (145.76045322418213,0)})
(5.032503,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 260, countUnknown = 1, countTodo = 145, timeSkipped = 0.0, timeBuilt = 3.678367782384157, timeUnknown = 0.0, timeTodo = (143.16193342208862,0)})
(6.339634,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 261, countUnknown = 1, countTodo = 144, timeSkipped = 0.0, timeBuilt = 6.317631412297487, timeUnknown = 0.0, timeTodo = (140.56067323684692,0)})
(7.6067605,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 262, countUnknown = 1, countTodo = 143, timeSkipped = 0.0, timeBuilt = 8.911890674382448, timeUnknown = 0.0, timeTodo = (137.99141645431519,0)})
(8.867887,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 265, countUnknown = 1, countTodo = 140, timeSkipped = 0.0, timeBuilt = 11.489148784428835, timeUnknown = 0.0, timeTodo = (135.31988191604614,0)})
(10.124012,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 266, countUnknown = 1, countTodo = 139, timeSkipped = 0.0, timeBuilt = 14.01840179041028, timeUnknown = 0.0, timeTodo = (132.74862432479858,0)})
(11.394139,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 270, countUnknown = 1, countTodo = 135, timeSkipped = 0.0, timeBuilt = 16.545654464513063, timeUnknown = 0.0, timeTodo = (130.13927960395813,0)})
(12.661266,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 278, countUnknown = 1, countTodo = 127, timeSkipped = 0.0, timeBuilt = 19.09290998056531, timeUnknown = 0.0, timeTodo = (127.51380348205566,0)})
(13.928392,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 280, countUnknown = 1, countTodo = 125, timeSkipped = 0.0, timeBuilt = 21.638164687901735, timeUnknown = 0.0, timeTodo = (124.94054412841797,0)})
(15.19352,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 283, countUnknown = 1, countTodo = 122, timeSkipped = 0.0, timeBuilt = 24.176418472081423, timeUnknown = 0.0, timeTodo = (122.32628202438354,0)})
(16.462646,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 284, countUnknown = 1, countTodo = 121, timeSkipped = 0.0, timeBuilt = 26.708671737462282, timeUnknown = 0.0, timeTodo = (119.71102046966553,0)})
(17.723772,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 286, countUnknown = 1, countTodo = 119, timeSkipped = 0.0, timeBuilt = 29.24492471292615, timeUnknown = 0.0, timeTodo = (117.13076210021973,0)})
(18.9989,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 290, countUnknown = 1, countTodo = 115, timeSkipped = 0.0, timeBuilt = 31.782178092747927, timeUnknown = 0.0, timeTodo = (114.32748413085938,0)})
(20.265026,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 294, countUnknown = 1, countTodo = 111, timeSkipped = 0.0, timeBuilt = 34.328432250767946, timeUnknown = 0.0, timeTodo = (110.95114421844482,0)})
(21.525152,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 296, countUnknown = 1, countTodo = 109, timeSkipped = 0.0, timeBuilt = 36.87168710306287, timeUnknown = 0.0, timeTodo = (108.3658857345581,0)})
(22.792278,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 297, countUnknown = 1, countTodo = 108, timeSkipped = 0.0, timeBuilt = 39.39593904092908, timeUnknown = 0.0, timeTodo = (105.79362773895264,0)})
(24.09941,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 301, countUnknown = 1, countTodo = 104, timeSkipped = 0.0, timeBuilt = 41.934190917760134, timeUnknown = 0.0, timeTodo = (103.46539211273193,0)})
(25.552555,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 307, countUnknown = 1, countTodo = 98, timeSkipped = 0.0, timeBuilt = 44.71846978738904, timeUnknown = 0.0, timeTodo = (100.06105041503906,0)})
(27.156715,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 310, countUnknown = 1, countTodo = 95, timeSkipped = 0.0, timeBuilt = 47.48474519327283, timeUnknown = 0.0, timeTodo = (97.22976875305176,0)}) (28.427843,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 311, countUnknown = 1, countTodo = 94, timeSkipped = 0.0, timeBuilt = 50.347032714635134, timeUnknown = 0.0, timeTodo = (94.5565013885498,0)}) (29.692968,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 312, countUnknown = 1, countTodo = 93, timeSkipped = 0.0, timeBuilt = 53.222318816930056, timeUnknown = 0.0, timeTodo = (91.93223762512207,0)}) (30.960096,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 314, countUnknown = 1, countTodo = 91, timeSkipped = 0.0, timeBuilt = 55.76357476785779, timeUnknown = 0.0, timeTodo = (89.2559642791748,0)}) (32.22322,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 317, countUnknown = 1, countTodo = 88, timeSkipped = 0.0, timeBuilt = 58.305828262120485, timeUnknown = 0.0, timeTodo = (86.59769439697266,0)}) (33.50235,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 321, countUnknown = 1, countTodo = 84, timeSkipped = 0.0, timeBuilt = 60.847080398350954, timeUnknown = 0.0, timeTodo = (83.98942756652832,0)}) (34.50545,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 321, countUnknown = 1, countTodo = 84, timeSkipped = 0.0, timeBuilt = 60.847080398350954, timeUnknown = 0.0, timeTodo = (83.98942756652832,0)}) (35.778576,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 327, countUnknown = 1, countTodo = 78, timeSkipped = 0.0, timeBuilt = 64.42744653299451, timeUnknown = 0.0, timeTodo = (79.1019344329834,0)}) (37.039703,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 331, countUnknown = 1, countTodo = 74, timeSkipped = 0.0, timeBuilt = 67.99280183389783, timeUnknown = 0.0, timeTodo = (75.79260063171387,0)}) (38.30183,Progress {isFailure = Nothing, countSkipped = 
0, countBuilt = 332, countUnknown = 1, countTodo = 73, timeSkipped = 0.0, timeBuilt = 70.5080529935658, timeUnknown = 0.0, timeTodo = (73.16133689880371,0)}) (39.60196,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 337, countUnknown = 1, countTodo = 68, timeSkipped = 0.0, timeBuilt = 73.04030625894666, timeUnknown = 0.0, timeTodo = (70.54007148742676,0)}) (40.9971,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 341, countUnknown = 1, countTodo = 64, timeSkipped = 0.0, timeBuilt = 75.7465783841908, timeUnknown = 0.0, timeTodo = (67.84580421447754,0)}) (42.251225,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 343, countUnknown = 1, countTodo = 62, timeSkipped = 0.0, timeBuilt = 78.44684617593884, timeUnknown = 0.0, timeTodo = (64.56347465515137,0)}) (43.51035,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 348, countUnknown = 1, countTodo = 57, timeSkipped = 0.0, timeBuilt = 80.97609345987439, timeUnknown = 0.0, timeTodo = (61.85720443725586,0)}) (45.31053,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt = 85.3115312345326, timeUnknown = 0.0, timeTodo = (56.32666015625,0)}) (46.31663,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt = 85.3115312345326, timeUnknown = 0.0, timeTodo = (56.32666015625,0)}) (47.953796,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 1, countTodo = 48, timeSkipped = 0.0, timeBuilt = 87.96779649332166, timeUnknown = 0.0, timeTodo = (52.5462760925293,0)}) (49.22192,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 1, countTodo = 48, timeSkipped = 0.0, timeBuilt = 87.96779649332166, timeUnknown = 0.0, timeTodo = (52.5462760925293,0)}) (50.483047,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 358, countUnknown = 1, countTodo = 47, timeSkipped = 0.0, 
timeBuilt = 90.48504655435681, timeUnknown = 0.0, timeTodo = (49.943016052246094,0)}) (51.742172,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 362, countUnknown = 1, countTodo = 43, timeSkipped = 0.0, timeBuilt = 94.27242677286267, timeUnknown = 0.0, timeTodo = (46.054622650146484,0)}) (53.0143,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 364, countUnknown = 1, countTodo = 41, timeSkipped = 0.0, timeBuilt = 96.79567353799939, timeUnknown = 0.0, timeTodo = (43.455360412597656,0)}) (54.279427,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 365, countUnknown = 1, countTodo = 40, timeSkipped = 0.0, timeBuilt = 99.32792680338025, timeUnknown = 0.0, timeTodo = (40.88410568237305,0)}) (55.549553,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 368, countUnknown = 1, countTodo = 37, timeSkipped = 0.0, timeBuilt = 101.87118165567517, timeUnknown = 0.0, timeTodo = (38.3088493347168,0)}) (56.90669,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 371, countUnknown = 1, countTodo = 34, timeSkipped = 0.0, timeBuilt = 104.48744599893689, timeUnknown = 0.0, timeTodo = (35.06652069091797,0)}) (58.179817,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 372, countUnknown = 1, countTodo = 33, timeSkipped = 0.0, timeBuilt = 107.11970918253064, timeUnknown = 0.0, timeTodo = (32.46625900268555,0)}) (59.453945,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 380, countUnknown = 1, countTodo = 25, timeSkipped = 0.0, timeBuilt = 109.69997041299939, timeUnknown = 0.0, timeTodo = (29.24193572998047,0)}) (61.17312,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 114.0083963163197, timeUnknown = 0.0, timeTodo = (24.391448974609375,0)}) (62.17922,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 114.0083963163197, timeUnknown = 0.0, 
timeTodo = (24.391448974609375,0)}) (63.787376,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 390, countUnknown = 1, countTodo = 15, timeSkipped = 0.0, timeBuilt = 116.61765687540174, timeUnknown = 0.0, timeTodo = (21.717185974121094,0)}) (67.25372,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 393, countUnknown = 1, countTodo = 12, timeSkipped = 0.0, timeBuilt = 118.84489076212049, timeUnknown = 0.0, timeTodo = (19.470962524414063,0)}) (69.114914,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 399, countUnknown = 1, countTodo = 6, timeSkipped = 0.0, timeBuilt = 124.42945497110486, timeUnknown = 0.0, timeTodo = (13.024314880371094,0)}) (70.91309,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 402, countUnknown = 1, countTodo = 3, timeSkipped = 0.0, timeBuilt = 127.54076020792127, timeUnknown = 0.0, timeTodo = (10.791084289550781,0)}) (72.67527,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (73.68037,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (74.683464,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (75.68657,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (76.68967,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (77.69477,Progress 
{isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) shake-0.16.4/src/Test/Progress/self-rebuild-j2.prog0000644000000000000000000002753613261223302020163 0ustar0000000000000000(3.6053605,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 258, countUnknown = 1, countTodo = 147, timeSkipped = 0.0, timeBuilt = 1.054105345858261, timeUnknown = 0.0, timeTodo = (135.01951599121094,0)}) (4.8964896,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 259, countUnknown = 1, countTodo = 146, timeSkipped = 0.0, timeBuilt = 3.6373634978663176, timeUnknown = 0.0, timeTodo = (132.3802523612976,0)}) (6.189619,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 260, countUnknown = 1, countTodo = 145, timeSkipped = 0.0, timeBuilt = 6.2686262771021575, timeUnknown = 0.0, timeTodo = (129.85099935531616,0)}) (7.4677467,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 262, countUnknown = 1, countTodo = 143, timeSkipped = 0.0, timeBuilt = 8.854884450091049, timeUnknown = 0.0, timeTodo = (127.25373983383179,0)}) (8.742874,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 268, countUnknown = 1, countTodo = 137, timeSkipped = 0.0, timeBuilt = 11.444143597735092, timeUnknown = 0.0, timeTodo = (124.70048475265503,0)}) (10.016002,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 271, countUnknown = 1, countTodo = 134, timeSkipped = 0.0, timeBuilt = 16.5466540500056, timeUnknown = 0.0, timeTodo = (119.64597797393799,0)}) (11.30813,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 271, countUnknown = 1, countTodo = 134, timeSkipped = 0.0, timeBuilt = 16.5466540500056, timeUnknown = 0.0, timeTodo = (119.64597797393799,0)}) (12.59526,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 276, countUnknown = 1, countTodo = 129, timeSkipped = 0.0, timeBuilt = 
19.123910252703354, timeUnknown = 0.0, timeTodo = (117.0657205581665,0)}) (13.876388,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 278, countUnknown = 1, countTodo = 127, timeSkipped = 0.0, timeBuilt = 21.708168332232162, timeUnknown = 0.0, timeTodo = (114.43845868110657,0)}) (15.173517,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 280, countUnknown = 1, countTodo = 125, timeSkipped = 0.0, timeBuilt = 24.27642471040599, timeUnknown = 0.0, timeTodo = (111.89620327949524,0)}) (16.456646,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 282, countUnknown = 1, countTodo = 123, timeSkipped = 0.0, timeBuilt = 26.85868198121898, timeUnknown = 0.0, timeTodo = (109.3659496307373,0)}) (17.726772,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 284, countUnknown = 1, countTodo = 121, timeSkipped = 0.0, timeBuilt = 29.442940060747787, timeUnknown = 0.0, timeTodo = (106.83069705963135,0)}) (19.027903,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 288, countUnknown = 1, countTodo = 117, timeSkipped = 0.0, timeBuilt = 32.00519592012279, timeUnknown = 0.0, timeTodo = (104.28944301605225,0)}) (20.345034,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 292, countUnknown = 1, countTodo = 113, timeSkipped = 0.0, timeBuilt = 34.586452786577865, timeUnknown = 0.0, timeTodo = (101.74619007110596,0)}) (21.648165,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 300, countUnknown = 1, countTodo = 105, timeSkipped = 0.0, timeBuilt = 37.32872611726634, timeUnknown = 0.0, timeTodo = (99.18993377685547,0)}) (23.717371,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 301, countUnknown = 1, countTodo = 104, timeSkipped = 0.0, timeBuilt = 40.75507003511302, timeUnknown = 0.0, timeTodo = (96.41965675354004,0)}) (25.022503,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 303, countUnknown = 1, countTodo = 102, timeSkipped = 0.0, timeBuilt = 44.12840873445384, timeUnknown = 0.0, 
timeTodo = (93.89340400695801,0)}) (26.289629,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 304, countUnknown = 1, countTodo = 101, timeSkipped = 0.0, timeBuilt = 46.6976664706599, timeUnknown = 0.0, timeTodo = (91.36415100097656,0)}) (27.576757,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 305, countUnknown = 1, countTodo = 100, timeSkipped = 0.0, timeBuilt = 49.27292472566478, timeUnknown = 0.0, timeTodo = (88.50186347961426,0)}) (29.49495,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 308, countUnknown = 1, countTodo = 97, timeSkipped = 0.0, timeBuilt = 52.47924644197337, timeUnknown = 0.0, timeTodo = (84.93850708007813,0)}) (31.11511,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 311, countUnknown = 1, countTodo = 94, timeSkipped = 0.0, timeBuilt = 56.04560119356029, timeUnknown = 0.0, timeTodo = (81.36915397644043,0)}) (32.42124,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 313, countUnknown = 1, countTodo = 92, timeSkipped = 0.0, timeBuilt = 58.938888852251694, timeUnknown = 0.0, timeTodo = (78.84289932250977,0)}) (33.68837,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 315, countUnknown = 1, countTodo = 90, timeSkipped = 0.0, timeBuilt = 61.51514465059154, timeUnknown = 0.0, timeTodo = (75.96561431884766,0)}) (34.961494,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 322, countUnknown = 1, countTodo = 83, timeSkipped = 0.0, timeBuilt = 64.10740501130931, timeUnknown = 0.0, timeTodo = (73.41636085510254,0)}) (36.245625,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 330, countUnknown = 1, countTodo = 75, timeSkipped = 0.0, timeBuilt = 66.67165786470287, timeUnknown = 0.0, timeTodo = (70.8551082611084,0)}) (37.545753,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 331, countUnknown = 1, countTodo = 74, timeSkipped = 0.0, timeBuilt = 69.22791320527904, timeUnknown = 0.0, timeTodo = (68.3188533782959,0)}) (38.944893,Progress 
{isFailure = Nothing, countSkipped = 0, countBuilt = 336, countUnknown = 1, countTodo = 69, timeSkipped = 0.0, timeBuilt = 71.91318160737865, timeUnknown = 0.0, timeTodo = (65.60857582092285,0)}) (40.345036,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 337, countUnknown = 1, countTodo = 68, timeSkipped = 0.0, timeBuilt = 74.72146255220287, timeUnknown = 0.0, timeTodo = (62.849300384521484,0)}) (41.651165,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 340, countUnknown = 1, countTodo = 65, timeSkipped = 0.0, timeBuilt = 77.43573409761302, timeUnknown = 0.0, timeTodo = (60.147029876708984,0)}) (42.9913,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 345, countUnknown = 1, countTodo = 60, timeSkipped = 0.0, timeBuilt = 80.11199981416576, timeUnknown = 0.0, timeTodo = (57.615779876708984,0)}) (44.28843,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 347, countUnknown = 1, countTodo = 58, timeSkipped = 0.0, timeBuilt = 82.76326781953685, timeUnknown = 0.0, timeTodo = (55.096527099609375,0)}) (45.844585,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 350, countUnknown = 1, countTodo = 55, timeSkipped = 0.0, timeBuilt = 85.57555038179271, timeUnknown = 0.0, timeTodo = (53.285343170166016,0)}) (46.848686,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt = 88.44083816255443, timeUnknown = 0.0, timeTodo = (50.76008987426758,0)}) (47.852783,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt = 88.44083816255443, timeUnknown = 0.0, timeTodo = (50.76008987426758,0)}) (50.848083,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 1, countTodo = 48, timeSkipped = 0.0, timeBuilt = 92.16721374238841, timeUnknown = 0.0, timeTodo = (48.103824615478516,0)}) (52.852283,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 
358, countUnknown = 1, countTodo = 47, timeSkipped = 0.0, timeBuilt = 95.5145495578181, timeUnknown = 0.0, timeTodo = (45.494564056396484,0)}) (54.130413,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 362, countUnknown = 1, countTodo = 43, timeSkipped = 0.0, timeBuilt = 100.13300735200755, timeUnknown = 0.0, timeTodo = (41.70818328857422,0)}) (55.41054,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 363, countUnknown = 1, countTodo = 42, timeSkipped = 0.0, timeBuilt = 102.67526275361888, timeUnknown = 0.0, timeTodo = (39.17192840576172,0)}) (56.414642,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 369, countUnknown = 1, countTodo = 36, timeSkipped = 0.0, timeBuilt = 105.25251800264232, timeUnknown = 0.0, timeTodo = (36.63368225097656,0)}) (59.073906,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 372, countUnknown = 1, countTodo = 33, timeSkipped = 0.0, timeBuilt = 107.67676574434154, timeUnknown = 0.0, timeTodo = (34.10843276977539,0)}) (60.356033,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 374, countUnknown = 1, countTodo = 31, timeSkipped = 0.0, timeBuilt = 110.21202117647044, timeUnknown = 0.0, timeTodo = (31.546173095703125,0)}) (61.641163,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 375, countUnknown = 1, countTodo = 30, timeSkipped = 0.0, timeBuilt = 114.02240401948802, timeUnknown = 0.0, timeTodo = (28.913909912109375,0)}) (62.92029,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 381, countUnknown = 1, countTodo = 24, timeSkipped = 0.0, timeBuilt = 116.60766250337474, timeUnknown = 0.0, timeTodo = (26.366653442382813,0)}) (65.09751,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 121.40914184297435, timeUnknown = 0.0, timeTodo = (22.06322479248047,0)}) (66.10161,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, 
timeSkipped = 0.0, timeBuilt = 121.40914184297435, timeUnknown = 0.0, timeTodo = (22.06322479248047,0)}) (67.678764,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 390, countUnknown = 1, countTodo = 15, timeSkipped = 0.0, timeBuilt = 123.98239928926341, timeUnknown = 0.0, timeTodo = (19.453964233398438,0)}) (71.20812,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 393, countUnknown = 1, countTodo = 12, timeSkipped = 0.0, timeBuilt = 126.19761687959544, timeUnknown = 0.0, timeTodo = (17.226730346679688,0)}) (72.21122,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 396, countUnknown = 1, countTodo = 9, timeSkipped = 0.0, timeBuilt = 128.785888974322, timeUnknown = 0.0, timeTodo = (14.163421630859375,0)}) (73.53935,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 399, countUnknown = 1, countTodo = 6, timeSkipped = 0.0, timeBuilt = 133.6913836642634, timeUnknown = 0.0, timeTodo = (11.642166137695313,0)}) (75.757576,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 402, countUnknown = 1, countTodo = 3, timeSkipped = 0.0, timeBuilt = 135.9266131564509, timeUnknown = 0.0, timeTodo = (8.530860900878906,0)}) (77.91779,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (78.92289,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (79.925995,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (80.93109,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, 
timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (81.93419,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (82.93929,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) shake-0.16.4/src/Test/Progress/self-clean-j2.prog0000644000000000000000000003047413261223302017612 0ustar0000000000000000(1.120224,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 254, countUnknown = 0, countTodo = 145, timeSkipped = 0.0, timeBuilt = 0.8861771221272647, timeUnknown = 0.0, timeTodo = (0.0,145)}) (2.4994998,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 258, countUnknown = 0, countTodo = 147, timeSkipped = 0.0, timeBuilt = 1.259251721901819, timeUnknown = 0.0, timeTodo = (0.0,147)}) (5.156031,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 259, countUnknown = 0, countTodo = 146, timeSkipped = 0.0, timeBuilt = 3.8557707152795047, timeUnknown = 0.0, timeTodo = (0.0,146)}) (6.4472423,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 261, countUnknown = 0, countTodo = 144, timeSkipped = 0.0, timeBuilt = 6.519303449196741, timeUnknown = 0.0, timeTodo = (0.0,144)}) (7.737371,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 262, countUnknown = 0, countTodo = 143, timeSkipped = 0.0, timeBuilt = 9.126777537865564, timeUnknown = 0.0, timeTodo = (0.0,143)}) (9.061503,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 267, countUnknown = 0, countTodo = 138, timeSkipped = 0.0, timeBuilt = 11.739121564431116, timeUnknown = 0.0, timeTodo = (0.0,138)}) (10.368634,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 268, countUnknown = 0, countTodo = 137, timeSkipped = 0.0, timeBuilt = 14.354383119149134, 
timeUnknown = 0.0, timeTodo = (0.0,137)}) (11.658763,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 272, countUnknown = 0, countTodo = 133, timeSkipped = 0.0, timeBuilt = 16.976646550698206, timeUnknown = 0.0, timeTodo = (0.0,133)}) (12.941892,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 274, countUnknown = 0, countTodo = 131, timeSkipped = 0.0, timeBuilt = 19.57990659098141, timeUnknown = 0.0, timeTodo = (0.0,131)}) (14.22602,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 279, countUnknown = 0, countTodo = 126, timeSkipped = 0.0, timeBuilt = 22.17116750101559, timeUnknown = 0.0, timeTodo = (0.0,126)}) (15.511148,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 282, countUnknown = 0, countTodo = 123, timeSkipped = 0.0, timeBuilt = 24.74242699961178, timeUnknown = 0.0, timeTodo = (0.0,123)}) (16.794277,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 283, countUnknown = 0, countTodo = 122, timeSkipped = 0.0, timeBuilt = 27.31368459085934, timeUnknown = 0.0, timeTodo = (0.0,122)}) (18.085405,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 288, countUnknown = 0, countTodo = 117, timeSkipped = 0.0, timeBuilt = 29.8999453864526, timeUnknown = 0.0, timeTodo = (0.0,117)}) (19.398537,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 294, countUnknown = 0, countTodo = 111, timeSkipped = 0.0, timeBuilt = 32.487208493752405, timeUnknown = 0.0, timeTodo = (0.0,111)}) (21.451742,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 295, countUnknown = 0, countTodo = 110, timeSkipped = 0.0, timeBuilt = 35.872547277016565, timeUnknown = 0.0, timeTodo = (0.0,110)}) (22.761873,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 297, countUnknown = 0, countTodo = 108, timeSkipped = 0.0, timeBuilt = 39.24288571695797, timeUnknown = 0.0, timeTodo = (0.0,108)}) (24.060003,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 298, countUnknown = 0, countTodo = 
107, timeSkipped = 0.0, timeBuilt = 41.825143941445276, timeUnknown = 0.0, timeTodo = (0.0,107)}) (25.555153,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 304, countUnknown = 0, countTodo = 101, timeSkipped = 0.0, timeBuilt = 44.66642964701168, timeUnknown = 0.0, timeTodo = (0.0,101)}) (26.868284,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 305, countUnknown = 0, countTodo = 100, timeSkipped = 0.0, timeBuilt = 47.460708745522425, timeUnknown = 0.0, timeTodo = (0.0,100)}) (28.22642,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 306, countUnknown = 0, countTodo = 99, timeSkipped = 0.0, timeBuilt = 50.111974843544886, timeUnknown = 0.0, timeTodo = (0.0,99)}) (29.52355,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 308, countUnknown = 0, countTodo = 97, timeSkipped = 0.0, timeBuilt = 52.788244374794886, timeUnknown = 0.0, timeTodo = (0.0,97)}) (30.82868,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 309, countUnknown = 0, countTodo = 96, timeSkipped = 0.0, timeBuilt = 55.38850606302731, timeUnknown = 0.0, timeTodo = (0.0,96)}) (31.83378,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 309, countUnknown = 0, countTodo = 96, timeSkipped = 0.0, timeBuilt = 55.38850606302731, timeUnknown = 0.0, timeTodo = (0.0,96)}) (33.099907,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 311, countUnknown = 0, countTodo = 94, timeSkipped = 0.0, timeBuilt = 57.70873845438473, timeUnknown = 0.0, timeTodo = (0.0,94)}) (34.455044,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 312, countUnknown = 0, countTodo = 93, timeSkipped = 0.0, timeBuilt = 62.583227284951136, timeUnknown = 0.0, timeTodo = (0.0,93)}) (35.775173,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 323, countUnknown = 0, countTodo = 82, timeSkipped = 0.0, timeBuilt = 65.22149098734371, timeUnknown = 0.0, timeTodo = (0.0,82)}) (37.087307,Progress {isFailure = Nothing, countSkipped = 0, countBuilt 
= 324, countUnknown = 0, countTodo = 81, timeSkipped = 0.0, timeBuilt = 67.89576161722653, timeUnknown = 0.0, timeTodo = (0.0,81)}) (38.39644,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 332, countUnknown = 0, countTodo = 73, timeSkipped = 0.0, timeBuilt = 70.54603017191403, timeUnknown = 0.0, timeTodo = (0.0,73)}) (39.705566,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 336, countUnknown = 0, countTodo = 69, timeSkipped = 0.0, timeBuilt = 73.18129933695309, timeUnknown = 0.0, timeTodo = (0.0,69)}) (41.676765,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 339, countUnknown = 0, countTodo = 66, timeSkipped = 0.0, timeBuilt = 76.48663152079098, timeUnknown = 0.0, timeTodo = (0.0,66)}) (42.969894,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 342, countUnknown = 0, countTodo = 63, timeSkipped = 0.0, timeBuilt = 79.77195943216793, timeUnknown = 0.0, timeTodo = (0.0,63)}) (44.364033,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 345, countUnknown = 0, countTodo = 60, timeSkipped = 0.0, timeBuilt = 82.46222890238278, timeUnknown = 0.0, timeTodo = (0.0,60)}) (45.367134,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 348, countUnknown = 0, countTodo = 57, timeSkipped = 0.0, timeBuilt = 85.15849507669918, timeUnknown = 0.0, timeTodo = (0.0,57)}) (47.0483,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 351, countUnknown = 0, countTodo = 54, timeSkipped = 0.0, timeBuilt = 87.8587666831445, timeUnknown = 0.0, timeTodo = (0.0,54)}) (48.052402,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 351, countUnknown = 0, countTodo = 54, timeSkipped = 0.0, timeBuilt = 87.8587666831445, timeUnknown = 0.0, timeTodo = (0.0,54)}) (49.890587,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 0, countTodo = 51, timeSkipped = 0.0, timeBuilt = 90.69304478983395, timeUnknown = 0.0, timeTodo = (0.0,51)}) (50.894688,Progress {isFailure = Nothing, 
countSkipped = 0, countBuilt = 354, countUnknown = 0, countTodo = 51, timeSkipped = 0.0, timeBuilt = 90.69304478983395, timeUnknown = 0.0, timeTodo = (0.0,51)}) (51.897785,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 0, countTodo = 51, timeSkipped = 0.0, timeBuilt = 90.69304478983395, timeUnknown = 0.0, timeTodo = (0.0,51)}) (54.92909,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 0, countTodo = 48, timeSkipped = 0.0, timeBuilt = 94.47342885355465, timeUnknown = 0.0, timeTodo = (0.0,48)}) (56.23222,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 358, countUnknown = 0, countTodo = 47, timeSkipped = 0.0, timeBuilt = 97.04368794779293, timeUnknown = 0.0, timeTodo = (0.0,47)}) (57.53035,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 360, countUnknown = 0, countTodo = 45, timeSkipped = 0.0, timeBuilt = 100.92107594828121, timeUnknown = 0.0, timeTodo = (0.0,45)}) (58.82748,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 361, countUnknown = 0, countTodo = 44, timeSkipped = 0.0, timeBuilt = 103.52433598856442, timeUnknown = 0.0, timeTodo = (0.0,44)}) (60.104607,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 368, countUnknown = 0, countTodo = 37, timeSkipped = 0.0, timeBuilt = 106.14560521463864, timeUnknown = 0.0, timeTodo = (0.0,37)}) (62.0268,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 369, countUnknown = 0, countTodo = 36, timeSkipped = 0.0, timeBuilt = 109.38093007425778, timeUnknown = 0.0, timeTodo = (0.0,36)}) (63.33293,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 372, countUnknown = 0, countTodo = 33, timeSkipped = 0.0, timeBuilt = 112.58624852518551, timeUnknown = 0.0, timeTodo = (0.0,33)}) (64.62606,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 374, countUnknown = 0, countTodo = 31, timeSkipped = 0.0, timeBuilt = 115.16150487284176, timeUnknown = 0.0, timeTodo = (0.0,31)}) 
(65.925186,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 379, countUnknown = 0, countTodo = 26, timeSkipped = 0.0, timeBuilt = 117.77476704935543, timeUnknown = 0.0, timeTodo = (0.0,26)}) (68.11641,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 0, countTodo = 18, timeSkipped = 0.0, timeBuilt = 122.62825597147457, timeUnknown = 0.0, timeTodo = (0.0,18)}) (69.121506,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 0, countTodo = 18, timeSkipped = 0.0, timeBuilt = 122.62825597147457, timeUnknown = 0.0, timeTodo = (0.0,18)}) (70.77867,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 390, countUnknown = 0, countTodo = 15, timeSkipped = 0.0, timeBuilt = 125.30251897196285, timeUnknown = 0.0, timeTodo = (0.0,15)}) (74.353035,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 393, countUnknown = 0, countTodo = 12, timeSkipped = 0.0, timeBuilt = 127.54874242166989, timeUnknown = 0.0, timeTodo = (0.0,12)}) (75.35613,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 396, countUnknown = 0, countTodo = 9, timeSkipped = 0.0, timeBuilt = 130.15401280741207, timeUnknown = 0.0, timeTodo = (0.0,9)}) (76.85028,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 399, countUnknown = 0, countTodo = 6, timeSkipped = 0.0, timeBuilt = 133.99539006571285, timeUnknown = 0.0, timeTodo = (0.0,6)}) (79.081505,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 402, countUnknown = 0, countTodo = 3, timeSkipped = 0.0, timeBuilt = 136.22862065653317, timeUnknown = 0.0, timeTodo = (0.0,3)}) (81.25272,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (82.260826,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 
0.0, timeTodo = (0.0,1)}) (83.26492,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (84.27003,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (85.274124,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (86.28023,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (87.284325,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (88.28843,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) shake-0.16.4/src/Test/Progress/progress-nan.prog0000644000000000000000000000137413261223302017703 0ustar0000000000000000(5.215521335601807,Progress {isFailure = Nothing, countSkipped = 5028, countBuilt = 0, countUnknown = 36224, countTodo = 1837, timeSkipped = 274149.05088049243, timeBuilt = 0.0, timeUnknown = 31144.710513075814, timeTodo = (4301.8477063977625,1)}) (12.997299194335938,Progress {isFailure = Nothing, countSkipped = 12154, countBuilt = 0, countUnknown = 23682, countTodo = 7253, timeSkipped = 274894.82577174786, timeBuilt = 0.0, timeUnknown = 23933.863898285897, timeTodo = (10766.919429932255,1)}) (18.471847534179688,Progress {isFailure = Nothing, countSkipped = 18273, countBuilt = 27, countUnknown = 13862, countTodo = 10931, timeSkipped = 
275406.36595057615, timeBuilt = 1.7951794527471066, timeUnknown = 17710.551982904668, timeTodo = (16478.41014388157,4)}) shake-0.16.4/src/Test/Ninja/0000755000000000000000000000000013261223302013622 5ustar0000000000000000shake-0.16.4/src/Test/Ninja/test7.ninja0000644000000000000000000000021013261223302015702 0ustar0000000000000000ninja_required_version = 1.5 build test : phony a b rule CUSTOM_COMMAND command = $COMMAND build a b : CUSTOM_COMMAND COMMAND = cd shake-0.16.4/src/Test/Ninja/test6.ninja0000644000000000000000000000007113261223302015706 0ustar0000000000000000v2 = g1 include ${v1}-inc.ninja subninja ${v1}-sub.ninja shake-0.16.4/src/Test/Ninja/test6-sub.ninja0000644000000000000000000000001013261223302016466 0ustar0000000000000000v2 = g3 shake-0.16.4/src/Test/Ninja/test6-inc.ninja0000644000000000000000000000001013261223302016446 0ustar0000000000000000v2 = g2 shake-0.16.4/src/Test/Ninja/test5.ninja0000644000000000000000000000007413261223302015710 0ustar0000000000000000 rule run command = touch $out build output$ file: run shake-0.16.4/src/Test/Ninja/test4.ninja0000644000000000000000000000017213261223302015706 0ustar0000000000000000 rule run command = touch $out build ./out.txt: run build dir/../out2.txt: run build out: phony ./out.txt out2.txt shake-0.16.4/src/Test/Ninja/test3.ninja0000644000000000000000000000034513261223302015707 0ustar0000000000000000 v1 = g1 v5 = g1 rule dump command = echo $v1+$v2+$v3+$v4+$v5 > $out v1 = g2 build out3.1: dump v2 = b1 v1 = g3 subninja test3-sub.ninja include test3-inc.ninja build out3.2: dump v1 = g4 include subdir/1.ninja shake-0.16.4/src/Test/Ninja/test3-sub.ninja0000644000000000000000000000005513261223302016474 0ustar0000000000000000v4 = s1 v5 = s1 build out3.4: dump v5 = s2 shake-0.16.4/src/Test/Ninja/test3-inc.ninja0000644000000000000000000000003413261223302016451 0ustar0000000000000000v5 = i1 build out3.3: dump shake-0.16.4/src/Test/Ninja/test2.ninja0000644000000000000000000000013013261223302015676 
0ustar0000000000000000 rule run command = touch $out build out2.1: run build out2.2: run default out2.2 shake-0.16.4/src/Test/Ninja/test1.ninja0000644000000000000000000000007013261223302015700 0ustar0000000000000000 rule run command = touch $out build out1.txt: run shake-0.16.4/src/Test/Ninja/restart.ninja0000644000000000000000000000020113261223302016320 0ustar0000000000000000 rule add command = echo build restart.txt: self >> $out build restart.ninja: add rule self command = echo $out > $out shake-0.16.4/src/Test/Ninja/redefine.ninja0000644000000000000000000000031513261223302016423 0ustar0000000000000000 vGlobal = version1 rule record # should produce version3 version2 command = echo $vGlobal $vBuild > $out vGlobal = version2 build redefine.txt: record vBuild = $vGlobal vGlobal = version3 shake-0.16.4/src/Test/Ninja/phonyorder.ninja0000644000000000000000000000025413261223302017035 0ustar0000000000000000 rule create command = touch $out rule copy command = cp $from $out build Foo2: phony || foo.txt build bar.txt: copy || Foo2 from = foo.txt build foo.txt: create shake-0.16.4/src/Test/Ninja/outputtouch.ninja0000644000000000000000000000011413261223302017242 0ustar0000000000000000 rule record command = echo hello > $out build outputtouch.txt: record shake-0.16.4/src/Test/Ninja/nocreate.ninja0000644000000000000000000000012413261223302016440 0ustar0000000000000000 rule gen command = echo x >> nocreate.log build nocreate.out: gen nocreate.in shake-0.16.4/src/Test/Ninja/lint.ninja0000644000000000000000000000054313261223302015613 0ustar0000000000000000 rule gen command = echo $out > $out rule run command = (echo $out : $out.gen > $out.d) && (echo $out > $out) depfile = $out.d build good: phony good1 good2 build good1: run || good1.gen build good1.gen: gen build good2: run || good2_phony build good2_phony: phony good2.gen build good2.gen: gen build bad: run | input build bad.gen: gen shake-0.16.4/src/Test/Ninja/lexical.ninja0000644000000000000000000000020113261223302016255 
0ustar0000000000000000 rule test.run command = echo ${foo.bar}$foo.bar > $out build lexical.txt: test.run foo.bar = XFoo_BarX foo = XFooX shake-0.16.4/src/Test/Ninja/continuations.ninja0000644000000000000000000000015413261223302017540 0ustar0000000000000000 rule $ run command $ = $ touch $ $out build $ continuations.txt $ : $ run shake-0.16.4/src/Test/Ninja/compdb.output0000644000000000000000000000204213261223302016346 0ustar0000000000000000[ { "directory": "*", "command": "g++ -MMD -MT build\\build.o -MF build\\build.o.d -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti -fno-exceptions -pipe -Wno-missing-field-initializers -DNINJA_PYTHON=\"python.exe\" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 -c src\\build.cc -o build\\build.o", "file": "src\\build.cc" }, { "directory": "*", "command": "g++ -MMD -MT build\\build_log.o -MF build\\build_log.o.d -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti -fno-exceptions -pipe -Wno-missing-field-initializers -DNINJA_PYTHON=\"python.exe\" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 -c src\\build_log.cc -o build\\build_log.o", "file": "src\\build_log.cc" }, { "directory": "*", "command": "g++ -MMD -MT build\\clean.o -MF build\\clean.o.d -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti -fno-exceptions -pipe -Wno-missing-field-initializers -DNINJA_PYTHON=\"python.exe\" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 -c src\\clean.cc -o build\\clean.o", "file": "src\\clean.cc" } ] shake-0.16.4/src/Test/Ninja/compdb.ninja0000644000000000000000000000244313261223302016112 0ustar0000000000000000# Copied from the Ninja repo ninja_required_version = 1.3 builddir = build cxx = g++ ar = ar cflags = -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti $ -fno-exceptions -pipe -Wno-missing-field-initializers $ -DNINJA_PYTHON="python.exe" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 ldflags = -L$builddir -static rule cxx command = $cxx -MMD -MT $out -MF $out.d $cflags -c $in -o $out description = CXX $out depfile = $out.d 
deps = gcc rule ar command = cmd /c $ar cqs $out.tmp $in && move /Y $out.tmp $out description = AR $out rule link command = $cxx $ldflags -o $out $in $libs description = LINK $out # the depfile parser and ninja lexers are generated using re2c. # Core source files all build into ninja library. build $builddir\build.o: cxx src\build.cc build $builddir\build_log.o: cxx src\build_log.cc build $builddir\clean.o: cxx src\clean.cc rule doxygen command = doxygen $in description = DOXYGEN $in doxygen_mainpage_generator = src\gen_doxygen_mainpage.sh rule doxygen_mainpage command = $doxygen_mainpage_generator $in > $out description = DOXYGEN_MAINPAGE $out build $builddir\doxygen_mainpage: doxygen_mainpage README COPYING | $ $doxygen_mainpage_generator build doxygen: doxygen doc\doxygen.config | $builddir\doxygen_mainpage default ninja.exe shake-0.16.4/src/Test/Ninja/buildseparate.ninja0000644000000000000000000000026713261223302017474 0ustar0000000000000000 rule record # produces XX command = hello command = echo $vBar $command > $out build buildseparate.txt: record vFoo = foo vBar = bar vBar = X${vFoo}${vBar}X shake-0.16.4/src/Test/Ninja/subdir/0000755000000000000000000000000013261223302015112 5ustar0000000000000000shake-0.16.4/src/Test/Ninja/subdir/2.ninja0000644000000000000000000000000013261223302016262 0ustar0000000000000000shake-0.16.4/src/Test/Ninja/subdir/1.ninja0000644000000000000000000000012713261223302016273 0ustar0000000000000000# weirdly, Ninja includes are not relative to who includes them include subdir/2.ninja shake-0.16.4/src/Test/C/0000755000000000000000000000000013261223302012745 5ustar0000000000000000shake-0.16.4/src/Test/C/main.c0000644000000000000000000000014113261223302014031 0ustar0000000000000000#include <stdio.h> #include "constants.h" int main() { printf("%s\n", message()); return 0; } shake-0.16.4/src/Test/C/constants.h0000644000000000000000000000002113261223302015123 0ustar0000000000000000char* message(); 
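The src/Test/C sources (main.c, constants.h, and constants.c) have the same shape as the example project in the user manual. A Shake build script for them, modelled on the shipped docs/manual/Build.hs, might look like the following sketch — the `_build` output directory and the gcc flags are illustrative assumptions, not part of the test suite itself:

```haskell
-- Sketch of a Shake build for the src/Test/C sources, modelled on the
-- docs/manual example; directory names and compiler flags are illustrative.
import Development.Shake
import Development.Shake.Command
import Development.Shake.FilePath
import Development.Shake.Util

main :: IO ()
main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
    want ["_build/run" <.> exe]

    -- link every object file into the final executable
    "_build/run" <.> exe %> \out -> do
        cs <- getDirectoryFiles "" ["//*.c"]
        let os = ["_build" </> c -<.> "o" | c <- cs]
        need os
        cmd_ "gcc -o" [out] os

    -- compile each .c file, capturing header dependencies via -MMD
    "_build//*.o" %> \out -> do
        let c = dropDirectory1 $ out -<.> "c"
        let m = out -<.> "m"
        cmd_ "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
        neededMakefileDependencies m
```

The `.m` files are gcc-generated makefile fragments; `neededMakefileDependencies` feeds them back to Shake so edits to a header such as constants.h trigger a recompile of main.c.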
shake-0.16.4/src/Test/C/constants.c0000644000000000000000000000010613261223302015122 0ustar0000000000000000 char msg[] = "Hello Shake Users!"; char* message() { return msg; } shake-0.16.4/src/General/0000755000000000000000000000000013261223301013220 5ustar0000000000000000shake-0.16.4/src/General/Timing.hs0000644000000000000000000000333413261223301015006 0ustar0000000000000000 module General.Timing(resetTimings, addTiming, printTimings) where import Data.IORef import System.IO.Unsafe import Numeric.Extra import System.Time.Extra {-# NOINLINE timer #-} timer :: IO Seconds timer = unsafePerformIO offsetTime {-# NOINLINE timings #-} timings :: IORef [(Seconds, String)] -- number of times called, newest first timings = unsafePerformIO $ newIORef [] resetTimings :: IO () resetTimings = do now <- timer writeIORef timings [(now, "Start")] -- | Print all withTiming information and clear the information. printTimings :: IO () printTimings = do now <- timer old <- atomicModifyIORef timings $ \ts -> ([(now, "Start")], ts) putStr $ unlines $ showTimings now $ reverse old addTiming :: String -> IO () addTiming msg = do now <- timer atomicModifyIORef timings $ \ts -> ((now,msg):ts, ()) showTimings :: Seconds -> [(Seconds, String)] -> [String] showTimings _ [] = [] showTimings stop times = showGap $ [(a ++ " ", showDP 3 b ++ "s " ++ showPerc b ++ " " ++ progress b) | (a,b) <- xs] ++ [("Total", showDP 3 sm ++ "s " ++ showPerc sm ++ " " ++ replicate 25 ' ')] where a // b = if b == 0 then 0 else a / b showPerc x = let s = show $ floor $ x * 100 // sm in replicate (3 - length s) ' ' ++ s ++ "%" progress x = let i = floor $ x * 25 // mx in replicate i '=' ++ replicate (25-i) ' ' mx = maximum $ map snd xs sm = sum $ map snd xs xs = [ (name, stop - start) | ((start, name), stop) <- zip times $ map fst (drop 1 times) ++ [stop]] showGap :: [(String,String)] -> [String] showGap xs = [a ++ replicate (n - length a - length b) ' ' ++ b | (a,b) <- xs] where n = maximum [length a + length b | 
(a,b) <- xs] shake-0.16.4/src/General/Template.hs0000644000000000000000000000416113261223301015331 0ustar0000000000000000{-# LANGUAGE ViewPatterns #-} module General.Template(runTemplate) where import System.FilePath.Posix import Control.Exception.Extra import Control.Monad.IO.Class import Data.Char import qualified Data.ByteString.Lazy.Char8 as LBS import qualified Language.Javascript.Flot as Flot import qualified Language.Javascript.JQuery as JQuery libraries = [("jquery.js", JQuery.file) ,("jquery.flot.js", Flot.file Flot.Flot) ,("jquery.flot.stack.js", Flot.file Flot.FlotStack) ] -- | Template Engine. Perform the following replacements on a line basis: -- -- * ==> -- -- * ==> runTemplate :: (Functor m, MonadIO m) => (FilePath -> m LBS.ByteString) -> LBS.ByteString -> m LBS.ByteString -- Functor constraint is required for GHC 7.8 and before runTemplate ask = fmap LBS.unlines . mapM f . LBS.lines where link = LBS.pack "\n" `LBS.append` res `LBS.append` LBS.pack "\n" | Just file <- lbsStripPrefix link y = do res <- grab file; return $ LBS.pack "" | otherwise = return x where y = LBS.dropWhile isSpace x grab = asker . takeWhile (/= '\"') . LBS.unpack asker o@(splitFileName -> ("lib/",x)) = case lookup x libraries of Just act -> liftIO $ LBS.readFile =<< act Nothing -> liftIO $ errorIO $ "Template library, unknown library: " ++ o asker x = ask x --------------------------------------------------------------------- -- COMPATIBILITY -- available in bytestring-0.10.8.0, GHC 8.0 and above -- alternative implementation below lbsStripPrefix :: LBS.ByteString -> LBS.ByteString -> Maybe LBS.ByteString lbsStripPrefix prefix text = if a == prefix then Just b else Nothing where (a,b) = LBS.splitAt (LBS.length prefix) text shake-0.16.4/src/General/Process.hs0000644000000000000000000002135613261223301015201 0ustar0000000000000000{-# LANGUAGE RecordWildCards #-} -- | A wrapping of createProcess to provide a more flexible interface. 
module General.Process( Buffer, newBuffer, readBuffer, process, ProcessOpts(..), Source(..), Destination(..) ) where import Control.Applicative import Control.Concurrent import Control.DeepSeq import Control.Exception.Extra as C import Control.Monad.Extra import Data.List.Extra import Data.Maybe import Foreign.C.Error import System.Exit import System.IO.Extra import System.Info.Extra import System.Process import System.Time.Extra import Data.Unique import Data.IORef import qualified Data.ByteString.Internal as BS(createAndTrim) import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Lazy as LBS import General.Extra import Prelude import GHC.IO.Exception (IOErrorType(..), IOException(..)) --------------------------------------------------------------------- -- BUFFER ABSTRACTION data Buffer a = Buffer Unique (IORef [a]) instance Eq (Buffer a) where Buffer x _ == Buffer y _ = x == y instance Ord (Buffer a) where compare (Buffer x _) (Buffer y _) = compare x y newBuffer :: IO (Buffer a) newBuffer = liftM2 Buffer newUnique (newIORef []) addBuffer :: Buffer a -> a -> IO () addBuffer (Buffer _ ref) x = atomicModifyIORef ref $ \xs -> (x:xs, ()) readBuffer :: Buffer a -> IO [a] readBuffer (Buffer _ ref) = reverse <$> readIORef ref --------------------------------------------------------------------- -- OPTIONS data Source = SrcFile FilePath | SrcString String | SrcBytes LBS.ByteString data Destination = DestEcho | DestFile FilePath | DestString (Buffer String) | DestBytes (Buffer BS.ByteString) deriving (Eq,Ord) isDestString DestString{} = True; isDestString _ = False isDestBytes DestBytes{} = True; isDestBytes _ = False data ProcessOpts = ProcessOpts {poCommand :: CmdSpec ,poCwd :: Maybe FilePath ,poEnv :: Maybe [(String, String)] ,poTimeout :: Maybe Double ,poStdin :: [Source] ,poStdout :: [Destination] ,poStderr :: [Destination] ,poAsync :: Bool } --------------------------------------------------------------------- -- IMPLEMENTATION -- | If two 
buffers can be replaced by one and a copy, do that (only if they start empty) optimiseBuffers :: ProcessOpts -> IO (ProcessOpts, IO ()) optimiseBuffers po@ProcessOpts{..} = return (po{poStdout = nubOrd poStdout, poStderr = nubOrd poStderr}, return ()) stdStream :: (FilePath -> Handle) -> [Destination] -> [Destination] -> StdStream stdStream file [DestEcho] other = Inherit stdStream file [DestFile x] other | other == [DestFile x] || DestFile x `notElem` other = UseHandle $ file x stdStream file _ _ = CreatePipe stdIn :: (FilePath -> Handle) -> [Source] -> (StdStream, Handle -> IO ()) stdIn file [] = (Inherit, const $ return ()) stdIn file [SrcFile x] = (UseHandle $ file x, const $ return ()) stdIn file src = (,) CreatePipe $ \h -> ignoreSigPipe $ do forM_ src $ \x -> case x of SrcString x -> hPutStr h x SrcBytes x -> LBS.hPutStr h x SrcFile x -> LBS.hPutStr h =<< LBS.hGetContents (file x) hClose h ignoreSigPipe :: IO () -> IO () ignoreSigPipe = handleIO $ \e -> case e of IOError {ioe_type=ResourceVanished, ioe_errno=Just ioe} | Errno ioe == ePIPE -> return () _ -> throwIO e withTimeout :: Maybe Double -> IO () -> IO a -> IO a withTimeout Nothing stop go = go withTimeout (Just s) stop go = bracket (forkIO $ sleep s >> stop) killThread $ const go cmdSpec :: CmdSpec -> CreateProcess cmdSpec (ShellCommand x) = shell x cmdSpec (RawCommand x xs) = proc x xs forkWait :: IO a -> IO (IO a) forkWait a = do res <- newEmptyMVar _ <- mask $ \restore -> forkIO $ try_ (restore a) >>= putMVar res return $ takeMVar res >>= either throwIO return abort :: ProcessHandle -> IO () abort pid = do interruptProcessGroupOf pid sleep 5 -- give the process a few seconds grace period to die nicely -- seems to happen with some GHC 7.2 compiled binaries with FFI etc terminateProcess pid withFiles :: IOMode -> [FilePath] -> ((FilePath -> Handle) -> IO a) -> IO a withFiles mode files act = withs (map (`withFile` mode) files) $ \handles -> act $ \x -> fromJust $ lookup x $ zip files handles -- 
General approach taken from readProcessWithExitCode process :: ProcessOpts -> IO (ProcessHandle, ExitCode) process po = do (ProcessOpts{..}, flushBuffers) <- optimiseBuffers po let outFiles = nubOrd [x | DestFile x <- poStdout ++ poStderr] let inFiles = nubOrd [x | SrcFile x <- poStdin] withFiles WriteMode outFiles $ \outHandle -> withFiles ReadMode inFiles $ \inHandle -> do let cp = (cmdSpec poCommand){cwd = poCwd, env = poEnv, create_group = isJust poTimeout, close_fds = True ,std_in = fst $ stdIn inHandle poStdin ,std_out = stdStream outHandle poStdout poStderr, std_err = stdStream outHandle poStderr poStdout} withCreateProcessCompat cp $ \inh outh errh pid -> withTimeout poTimeout (abort pid) $ do let streams = [(outh, stdout, poStdout) | Just outh <- [outh], CreatePipe <- [std_out cp]] ++ [(errh, stderr, poStderr) | Just errh <- [errh], CreatePipe <- [std_err cp]] wait <- forM streams $ \(h, hh, dest) -> do -- no point tying the streams together if one is being streamed directly let isTied = not (poStdout `disjoint` poStderr) && length streams == 2 let isBinary = not $ any isDestString dest && not (any isDestBytes dest) when isTied $ hSetBuffering h LineBuffering when (DestEcho `elem` dest) $ do buf <- hGetBuffering hh case buf of BlockBuffering{} -> return () _ -> hSetBuffering h buf if isBinary then do hSetBinaryMode h True dest <- return $ for dest $ \d -> case d of DestEcho -> BS.hPut hh DestFile x -> BS.hPut (outHandle x) DestString x -> addBuffer x . (if isWindows then replace "\r\n" "\n" else id) . BS.unpack DestBytes x -> addBuffer x forkWait $ whileM $ do src <- bsHGetSome h 4096 mapM_ ($ src) dest notM $ hIsEOF h else if isTied then do dest <- return $ for dest $ \d -> case d of DestEcho -> hPutStrLn hh DestFile x -> hPutStrLn (outHandle x) DestString x -> addBuffer x . 
(++ "\n") forkWait $ whileM $ ifM (hIsEOF h) (return False) $ do src <- hGetLine h mapM_ ($ src) dest return True else do src <- hGetContents h wait1 <- forkWait $ C.evaluate $ rnf src waits <- forM dest $ \d -> case d of DestEcho -> forkWait $ hPutStr hh src DestFile x -> forkWait $ hPutStr (outHandle x) src DestString x -> do addBuffer x src; return $ return () return $ sequence_ $ wait1 : waits whenJust inh $ snd $ stdIn inHandle poStdin if poAsync then return (pid, ExitSuccess) else do sequence_ wait flushBuffers res <- waitForProcess pid whenJust outh hClose whenJust errh hClose return (pid, res) --------------------------------------------------------------------- -- COMPATIBILITY -- available in bytestring-0.9.1.10, GHC 7.8 and above -- implementation copied below bsHGetSome :: Handle -> Int -> IO BS.ByteString bsHGetSome h i = BS.createAndTrim i $ \p -> hGetBufSome h p i -- available in process-1.4.3.0, GHC ??? (Nov 2015) -- logic copied directly (apart from Ctrl-C handling magic using internal pieces) withCreateProcessCompat :: CreateProcess -> (Maybe Handle -> Maybe Handle -> Maybe Handle -> ProcessHandle -> IO a) -> IO a withCreateProcessCompat cp act = bracketOnError (createProcess cp) cleanup (\(m_in, m_out, m_err, ph) -> act m_in m_out m_err ph) where cleanup (inh, outh, errh, pid) = do terminateProcess pid whenJust inh $ ignoreSigPipe . hClose whenJust outh hClose whenJust errh hClose forkIO $ void $ waitForProcess pid shake-0.16.4/src/General/Makefile.hs0000644000000000000000000000254013261223301015272 0ustar0000000000000000 module General.Makefile(parseMakefile) where import qualified Data.ByteString.Char8 as BS import Data.Char endsSlash :: BS.ByteString -> Bool endsSlash = BS.isSuffixOf (BS.singleton '\\') wordsMakefile :: BS.ByteString -> [BS.ByteString] wordsMakefile = f . 
BS.splitWith isSpace where f (x:xs) | BS.null x = f xs f (x:y:xs) | endsSlash x = f $ BS.concat [BS.init x, BS.singleton ' ', y] : xs f (x:xs) = x : f xs f [] = [] parseMakefile :: BS.ByteString -> [(BS.ByteString, [BS.ByteString])] parseMakefile = concatMap f . join . linesCR where join xs = case span endsSlash xs of ([], []) -> [] (xs, []) -> [BS.unwords $ map BS.init xs] ([], y:ys) -> y : join ys (xs, y:ys) -> BS.unwords (map BS.init xs ++ [y]) : join ys f x = [(a, wordsMakefile $ BS.drop 1 b) | a <- wordsMakefile a] where (a,b) = BS.break (== ':') $ BS.takeWhile (/= '#') x -- | This is a hot-spot, so optimised linesCR :: BS.ByteString -> [BS.ByteString] linesCR x = case BS.split '\n' x of x:xs | Just ('\r',x) <- unsnoc x -> x : map (\x -> case unsnoc x of Just ('\r',x) -> x; _ -> x) xs xs -> xs where -- the ByteString unsnoc was introduced in a newer version unsnoc x | BS.null x = Nothing | otherwise = Just (BS.last x, BS.init x) shake-0.16.4/src/General/ListBuilder.hs0000644000000000000000000000123713261223301016001 0ustar0000000000000000 module General.ListBuilder( ListBuilder, runListBuilder, newListBuilder ) where import Data.Semigroup (Semigroup (..)) import Data.Monoid hiding ((<>)) import Prelude() data ListBuilder a = Zero | One a | Add (ListBuilder a) (ListBuilder a) instance Semigroup (ListBuilder a) where Zero <> x = x x <> Zero = x x <> y = Add x y instance Monoid (ListBuilder a) where mempty = Zero mappend = (<>) newListBuilder :: a -> ListBuilder a newListBuilder = One runListBuilder :: ListBuilder a -> [a] runListBuilder x = f x [] where f Zero acc = [] f (One x) acc = x : acc f (Add x y) acc = f x (f y acc) shake-0.16.4/src/General/Intern.hs0000644000000000000000000000222313261223301015012 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving #-} module General.Intern( Intern, Id(..), empty, insert, add, lookup, toList, fromList ) where import Development.Shake.Classes import Foreign.Storable import Data.Word import Prelude hiding 
(lookup) import qualified Data.HashMap.Strict as Map import Data.List(foldl') -- Invariant: The first field is the highest value in the Map data Intern a = Intern {-# UNPACK #-} !Word32 !(Map.HashMap a Id) newtype Id = Id Word32 deriving (Eq,Hashable,Binary,Show,NFData,Storable) empty :: Intern a empty = Intern 0 Map.empty insert :: (Eq a, Hashable a) => a -> Id -> Intern a -> Intern a insert k v@(Id i) (Intern n mp) = Intern (max n i) $ Map.insert k v mp add :: (Eq a, Hashable a) => a -> Intern a -> (Intern a, Id) add k (Intern v mp) = (Intern v2 $ Map.insert k (Id v2) mp, Id v2) where v2 = v + 1 lookup :: (Eq a, Hashable a) => a -> Intern a -> Maybe Id lookup k (Intern n mp) = Map.lookup k mp toList :: Intern a -> [(a, Id)] toList (Intern a b) = Map.toList b fromList :: (Eq a, Hashable a) => [(a, Id)] -> Intern a fromList xs = Intern (foldl' max 0 [i | (_, Id i) <- xs]) (Map.fromList xs) shake-0.16.4/src/General/Ids.hs0000644000000000000000000000710213261223301014273 0ustar0000000000000000{-# LANGUAGE RecordWildCards, BangPatterns, GADTs, UnboxedTuples #-} -- Note that argument order is more like IORef than Map, because its mutable module General.Ids( Ids, Id, empty, insert, lookup, null, size, sizeUpperBound, forWithKeyM_, for, toList, toMap ) where import Data.IORef.Extra import Data.Primitive.Array import Control.Exception import General.Intern(Id(..)) import Control.Monad.Extra import Data.Maybe import Data.Functor import qualified Data.HashMap.Strict as Map import Prelude hiding (lookup, null) import GHC.IO(IO(..)) import GHC.Exts(RealWorld) newtype Ids a = Ids (IORef (S a)) data S a = S {capacity :: {-# UNPACK #-} !Int -- ^ Number of entries in values, initially 0 ,used :: {-# UNPACK #-} !Int -- ^ Capacity that has been used, assuming no gaps from index 0, initially 0 ,values :: {-# UNPACK #-} !(MutableArray RealWorld (Maybe a)) } empty :: IO (Ids a) empty = do let capacity = 0 let used = 0 values <- newArray capacity Nothing Ids <$> newIORef S{..} 
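General.Ids is a mutable table keyed by `Id`, backed by a growable array of `Maybe` slots. A standalone usage sketch of the API defined in this module (not part of the shipped source — `demo` is a hypothetical name):

```haskell
-- Usage sketch for General.Ids: build a table, fill two slots sparsely,
-- and read them back.  Relies only on functions exported by this module
-- and the Id constructor from General.Intern.
import General.Ids (empty, insert, lookup, size)
import General.Intern (Id(..))
import Prelude hiding (lookup)

demo :: IO ()
demo = do
    ids <- empty                 -- capacity 0; grown on first insert
    insert ids (Id 0) "zero"
    insert ids (Id 5) "five"     -- slots 1..4 remain Nothing
    Just v <- lookup ids (Id 5)
    n <- size ids                -- counts only the filled slots, i.e. 2
    putStrLn (v ++ " / " ++ show n)
```

Note that `size` walks the array counting `Just` entries, whereas `sizeUpperBound` below merely returns the high-water mark `used`, which here would be 6.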
sizeUpperBound :: Ids a -> IO Int sizeUpperBound (Ids ref) = do S{..} <- readIORef ref return used size :: Ids a -> IO Int size (Ids ref) = do S{..} <- readIORef ref let go !acc i | i < 0 = return acc | otherwise = do v <- readArray values i if isJust v then go (acc+1) (i-1) else go acc (i-1) go 0 (used-1) toMap :: Ids a -> IO (Map.HashMap Id a) toMap ids = do mp <- Map.fromList <$> toListUnsafe ids return $! mp forWithKeyM_ :: Ids a -> (Id -> a -> IO ()) -> IO () forWithKeyM_ (Ids ref) f = do S{..} <- readIORef ref let go !i | i >= used = return () | otherwise = do v <- readArray values i whenJust v $ f $ Id $ fromIntegral i go $ i+1 go 0 for :: Ids a -> (a -> b) -> IO (Ids b) for (Ids ref) f = do S{..} <- readIORef ref values2 <- newArray capacity Nothing let go !i | i >= used = return () | otherwise = do v <- readArray values i whenJust v $ \v -> writeArray values2 i $ Just $ f v go $ i+1 go 0 Ids <$> newIORef (S capacity used values2) toListUnsafe :: Ids a -> IO [(Id, a)] toListUnsafe (Ids ref) = do S{..} <- readIORef ref -- execute in O(1) stack -- see https://neilmitchell.blogspot.co.uk/2015/09/making-sequencemapm-for-io-take-o1-stack.html let index r i | i >= used = [] index r i | IO io <- readArray values i = case io r of (# r, Nothing #) -> index r (i+1) (# r, Just v #) -> (Id $ fromIntegral i, v) : index r (i+1) IO $ \r -> (# r, index r 0 #) toList :: Ids a -> IO [(Id, a)] toList ids = do xs <- toListUnsafe ids let demand (x:xs) = demand xs demand [] = () evaluate $ demand xs return xs null :: Ids a -> IO Bool null ids = (== 0) <$> sizeUpperBound ids insert :: Ids a -> Id -> a -> IO () insert (Ids ref) (Id i) v = do S{..} <- readIORef ref let ii = fromIntegral i if ii < capacity then do writeArray values ii $ Just v when (ii >= used) $ writeIORef' ref S{used=ii+1,..} else do c2 <- return $ max (capacity * 2) (ii + 10000) v2 <- newArray c2 Nothing copyMutableArray v2 0 values 0 capacity writeArray v2 ii $ Just v writeIORef' ref $ S c2 (ii+1) v2 lookup :: 
Ids a -> Id -> IO (Maybe a) lookup (Ids ref) (Id i) = do S{..} <- readIORef ref let ii = fromIntegral i if ii < used then readArray values ii else return Nothing shake-0.16.4/src/General/GetOpt.hs0000644000000000000000000000526513261223301014766 0ustar0000000000000000 module General.GetOpt( OptDescr(..), ArgDescr(..), getOpt, fmapOptDescr, showOptDescr, mergeOptDescr, removeOverlap, optionsEnum, optionsEnumDesc ) where import qualified System.Console.GetOpt as O import System.Console.GetOpt hiding (getOpt) import qualified Data.HashSet as Set import Data.Maybe import Data.Either import Data.List.Extra getOpt :: [OptDescr (Either String a)] -> [String] -> ([a], [String], [String]) getOpt opts args = (flagGood, files, flagBad ++ errs) where (flags, files, errs) = O.getOpt O.Permute opts args (flagBad, flagGood) = partitionEithers flags -- fmap is only an instance in later GHC 7.8 and above, so fake our own version fmapOptDescr :: (a -> b) -> OptDescr (Either String a) -> OptDescr (Either String b) fmapOptDescr f (Option a b c d) = Option a b (g c) d where g (NoArg a) = NoArg $ fmap f a g (ReqArg a b) = ReqArg (fmap f . a) b g (OptArg a b) = OptArg (fmap f . 
a) b showOptDescr :: [OptDescr a] -> [String] showOptDescr xs = concat [ if nargs <= 26 then [" " ++ args ++ replicate (28 - nargs) ' ' ++ desc] else [" " ++ args, replicate 30 ' ' ++ desc] | Option s l arg desc <- xs , let args = intercalate ", " $ map (short arg) s ++ map (long arg) l , let nargs = length args] where short NoArg{} x = "-" ++ [x] short (ReqArg _ b) x = "-" ++ [x] ++ " " ++ b short (OptArg _ b) x = "-" ++ [x] ++ "[" ++ b ++ "]" long NoArg{} x = "--" ++ x long (ReqArg _ b) x = "--" ++ x ++ "=" ++ b long (OptArg _ b) x = "--" ++ x ++ "[=" ++ b ++ "]" -- | Remove flags from the first field that are present in the second removeOverlap :: [OptDescr b] -> [OptDescr a] -> [OptDescr a] removeOverlap bad = mapMaybe f where short = Set.fromList $ concat [x | Option x _ _ _ <- bad] long = Set.fromList $ concat [x | Option _ x _ _ <- bad] f (Option a b c d) | null a2 && null b2 = Nothing | otherwise = Just $ Option a2 b2 c d where a2 = filter (not . flip Set.member short) a b2 = filter (not . 
flip Set.member long) b mergeOptDescr :: [OptDescr (Either String a)] -> [OptDescr (Either String b)] -> [OptDescr (Either String (Either a b))] mergeOptDescr xs ys = map (fmapOptDescr Left) xs ++ map (fmapOptDescr Right) ys optionsEnum :: (Enum a, Bounded a, Show a) => [OptDescr (Either String a)] optionsEnum = optionsEnumDesc [(x, "Flag " ++ lower (show x) ++ ".") | x <- [minBound..maxBound]] optionsEnumDesc :: Show a => [(a, String)] -> [OptDescr (Either String a)] optionsEnumDesc xs = [Option "" [lower $ show x] (NoArg $ Right x) d | (x,d) <- xs] shake-0.16.4/src/General/FileLock.hs0000644000000000000000000000474613261223301015257 0ustar0000000000000000{-# LANGUAGE CPP #-} module General.FileLock(withLockFile) where import Control.Exception.Extra import System.FilePath import General.Extra #ifdef mingw32_HOST_OS import Data.Bits import Data.Word import Foreign.Ptr import Foreign.C.Types import Foreign.C.String #else import System.IO import System.Posix.IO #endif #ifdef mingw32_HOST_OS #ifdef x86_64_HOST_ARCH #define CALLCONV ccall #else #define CALLCONV stdcall #endif foreign import CALLCONV unsafe "Windows.h CreateFileW" c_CreateFileW :: Ptr CWchar -> Word32 -> Word32 -> Ptr () -> Word32 -> Word32 -> Ptr () -> IO (Ptr ()) foreign import CALLCONV unsafe "Windows.h CloseHandle" c_CloseHandle :: Ptr () -> IO Bool foreign import CALLCONV unsafe "Windows.h GetLastError" c_GetLastError :: IO Word32 c_GENERIC_WRITE = 0x40000000 :: Word32 c_GENERIC_READ = 0x80000000 :: Word32 c_FILE_SHARE_NONE = 0 :: Word32 c_OPEN_ALWAYS = 4 :: Word32 c_FILE_ATTRIBUTE_NORMAL = 0x80 :: Word32 c_INVALID_HANDLE_VALUE = intPtrToPtr (-1) c_ERROR_SHARING_VIOLATION = 32 #endif withLockFile :: FilePath -> IO a -> IO a #ifdef mingw32_HOST_OS withLockFile file act = withCWString file $ \cfile -> do createDirectoryRecursive $ takeDirectory file let open = c_CreateFileW cfile (c_GENERIC_READ .|. 
c_GENERIC_WRITE) c_FILE_SHARE_NONE nullPtr c_OPEN_ALWAYS c_FILE_ATTRIBUTE_NORMAL nullPtr bracket open c_CloseHandle $ \h -> if h == c_INVALID_HANDLE_VALUE then do err <- c_GetLastError errorIO $ "Shake failed to acquire a file lock on " ++ file ++ "\n" ++ (if err == c_ERROR_SHARING_VIOLATION then "ERROR_SHARING_VIOLATION - Shake is probably already running." else "Code " ++ show err ++ ", unknown reason for failure.") else act #else withLockFile file act = do createDirectoryRecursive $ takeDirectory file tryIO $ writeFile file "" bracket (openFd file ReadWrite Nothing defaultFileFlags) closeFd $ \fd -> do let lock = (WriteLock, AbsoluteSeek, 0, 0) res <- tryIO $ setLock fd lock case res of Right () -> act Left e -> do res <- getLock fd lock errorIO $ "Shake failed to acquire a file lock on " ++ file ++ "\n" ++ (case res of Nothing -> "" Just (pid, _) -> "Shake process ID " ++ show pid ++ " is using this lock.\n") ++ show e #endif shake-0.16.4/src/General/Extra.hs0000644000000000000000000001422413261223301014642 0ustar0000000000000000{-# LANGUAGE ScopedTypeVariables #-} module General.Extra( getProcessorCount, findGcc, withResultType, whenLeft, randomElem, wrapQuote, showBracket, withs, maximum', maximumBy', fastAt, forkFinallyUnmasked, isAsyncException, doesFileExist_, removeFile_, createDirectoryRecursive, catchIO, tryIO, handleIO ) where import Control.Exception import Data.Char import Data.List import System.Environment.Extra import System.IO.Extra import System.IO.Unsafe import System.Info.Extra import System.FilePath import System.Random import System.Directory import System.Exit import Control.Concurrent import Data.Maybe import Data.Functor import Data.Primitive.Array import Control.Applicative import Control.Monad import Control.Monad.ST import GHC.Conc(getNumProcessors) import Prelude --------------------------------------------------------------------- -- Prelude -- See https://ghc.haskell.org/trac/ghc/ticket/10830 - they broke maximumBy maximumBy' :: (a 
-> a -> Ordering) -> [a] -> a maximumBy' cmp = foldl1' $ \x y -> if cmp x y == GT then x else y maximum' :: Ord a => [a] -> a maximum' = maximumBy' compare --------------------------------------------------------------------- -- Data.List -- | If a string has any spaces then put quotes around and double up all internal quotes. -- Roughly the inverse of Windows command line parsing. wrapQuote :: String -> String wrapQuote xs | any isSpace xs = "\"" ++ concatMap (\x -> if x == '\"' then "\"\"" else [x]) xs ++ "\"" | otherwise = xs -- | If a string has any spaces then put brackets around it. wrapBracket :: String -> String wrapBracket xs | any isSpace xs = "(" ++ xs ++ ")" | otherwise = xs -- | Alias for @wrapBracket . show@. showBracket :: Show a => a -> String showBracket = wrapBracket . show -- | Version of '!!' which is fast and returns 'Nothing' if the index is not present. fastAt :: [a] -> (Int -> Maybe a) fastAt xs = \i -> if i < 0 || i >= n then Nothing else Just $ indexArray arr i where n = length xs arr = runST $ do let n = length xs arr <- newArray n undefined forM_ (zip [0..] xs) $ \(i,x) -> writeArray arr i x unsafeFreezeArray arr --------------------------------------------------------------------- -- System.Info {-# NOINLINE getProcessorCount #-} getProcessorCount :: IO Int -- unsafePerformIO so we cache the result and only compute it once getProcessorCount = let res = unsafePerformIO act in return res where act = if rtsSupportsBoundThreads then fromIntegral <$> getNumProcessors else do env <- lookupEnv "NUMBER_OF_PROCESSORS" case env of Just s | [(i,"")] <- reads s -> return i _ -> do src <- readFile' "/proc/cpuinfo" `catchIO` \_ -> return "" return $! max 1 $ length [() | x <- lines src, "processor" `isPrefixOf` x] -- Can you find a GCC executable?
return a Bool, and optionally something to add to $PATH to run it findGcc :: IO (Bool, Maybe FilePath) findGcc = do v <- findExecutable "gcc" case v of Nothing | isWindows -> do ghc <- findExecutable "ghc" case ghc of Just ghc -> do let gcc = takeDirectory (takeDirectory ghc) "mingw/bin/gcc.exe" b <- doesFileExist_ gcc return $ if b then (True, Just $ takeDirectory gcc) else (False, Nothing) _ -> return (False, Nothing) _ -> return (isJust v, Nothing) --------------------------------------------------------------------- -- System.Random randomElem :: [a] -> IO a randomElem xs = do when (null xs) $ fail "General.Extra.randomElem called with empty list, can't pick a random element" i <- randomRIO (0, length xs - 1) return $ xs !! i --------------------------------------------------------------------- -- Control.Monad withs :: [(a -> r) -> r] -> ([a] -> r) -> r withs [] act = act [] withs (f:fs) act = f $ \a -> withs fs $ \as -> act $ a:as --------------------------------------------------------------------- -- Control.Concurrent -- | Like 'forkFinally', but the inner thread is unmasked even if you started masked. 
forkFinallyUnmasked :: IO a -> (Either SomeException a -> IO ()) -> IO ThreadId forkFinallyUnmasked act cleanup = mask_ $ forkIOWithUnmask $ \unmask -> try (unmask act) >>= cleanup --------------------------------------------------------------------- -- Control.Exception -- | Is the exception asynchronous, not a "coding error" that should be ignored isAsyncException :: SomeException -> Bool isAsyncException e | Just (_ :: AsyncException) <- fromException e = True | Just (_ :: ExitCode) <- fromException e = True | otherwise = False catchIO :: IO a -> (IOException -> IO a) -> IO a catchIO = Control.Exception.catch -- GHC 7.4 has catch in the Prelude as well tryIO :: IO a -> IO (Either IOException a) tryIO = try handleIO :: (IOException -> IO a) -> IO a -> IO a handleIO = flip catchIO --------------------------------------------------------------------- -- System.Directory doesFileExist_ :: FilePath -> IO Bool doesFileExist_ x = doesFileExist x `catchIO` \_ -> return False -- | Remove a file, but don't worry if it fails removeFile_ :: FilePath -> IO () removeFile_ x = removeFile x `catchIO` \_ -> return () -- | Like @createDirectoryIfMissing True@ but faster, as it avoids -- any work in the common case the directory already exists. 
createDirectoryRecursive :: FilePath -> IO () createDirectoryRecursive dir = do x <- tryIO $ doesDirectoryExist dir when (x /= Right True) $ createDirectoryIfMissing True dir --------------------------------------------------------------------- -- Data.Either whenLeft :: Applicative m => Either a b -> (a -> m ()) -> m () whenLeft x f = either f (const $ pure ()) x --------------------------------------------------------------------- -- Data.Proxy -- Should be Proxy, but that's not available in older GHC 7.6 and before withResultType :: (Maybe a -> a) -> a withResultType f = f Nothing shake-0.16.4/src/General/Concurrent.hs0000644000000000000000000000201313261223301015672 0ustar0000000000000000 module General.Concurrent( Fence, newFence, signalFence, waitFence, testFence, ) where import Control.Applicative import Control.Monad import Data.IORef import Prelude --------------------------------------------------------------------- -- FENCE -- | Like a barrier, but based on callbacks newtype Fence a = Fence (IORef (Either [a -> IO ()] a)) instance Show (Fence a) where show _ = "Fence" newFence :: IO (Fence a) newFence = Fence <$> newIORef (Left []) signalFence :: Fence a -> a -> IO () signalFence (Fence ref) v = join $ atomicModifyIORef ref $ \x -> case x of Left queue -> (Right v, mapM_ ($ v) $ reverse queue) Right v -> error "Shake internal error, signalFence called twice on one Fence" waitFence :: Fence a -> (a -> IO ()) -> IO () waitFence (Fence ref) call = join $ atomicModifyIORef ref $ \x -> case x of Left queue -> (Left (call:queue), return ()) Right v -> (Right v, call v) testFence :: Fence a -> IO (Maybe a) testFence (Fence x) = either (const Nothing) Just <$> readIORef x shake-0.16.4/src/General/Cleanup.hs0000644000000000000000000000325413261223301015147 0ustar0000000000000000 -- | Code for ensuring cleanup actions are run. 
module General.Cleanup( Cleanup, withCleanup, addCleanup, addCleanup_ ) where import Control.Exception import qualified Data.HashMap.Strict as Map import Control.Monad import Data.IORef.Extra import Data.List.Extra data S = S {unique :: {-# UNPACK #-} !Int, items :: !(Map.HashMap Int (IO ()))} newtype Cleanup = Cleanup (IORef S) -- | Run with some cleanup scope. Regardless of exceptions/threads, all 'addCleanup' actions -- will be run by the time it exits. The 'addCleanup' actions will be run in reverse order. withCleanup :: (Cleanup -> IO a) -> IO a withCleanup act = do ref <- newIORef $ S 0 Map.empty act (Cleanup ref) `finally` runCleanup (Cleanup ref) -- | Run all the cleanup actions immediately. Done automatically by withCleanup runCleanup :: Cleanup -> IO () runCleanup (Cleanup ref) = do items <- atomicModifyIORef' ref $ \s -> (s{items=Map.empty}, items s) mapM_ snd $ sortOn (negate . fst) $ Map.toList items -- | Add a cleanup action to a 'Cleanup' scope, returning a way to remove (but not perform) that action. -- If not removed by the time 'withCleanup' terminates then the cleanup action will be run then. 
addCleanup :: Cleanup -> IO () -> IO (IO ()) addCleanup (Cleanup ref) act = atomicModifyIORef' ref $ \s -> let i = unique s in (,) (S (unique s + 1) (Map.insert i act $ items s)) $ atomicModifyIORef' ref $ \s -> (s{items = Map.delete i $ items s}, ()) addCleanup_ :: Cleanup -> IO () -> IO () -- we could avoid inserting into the Map, but we need to store the pairs anyway -- to unregister them in order, so might as well keep it simple addCleanup_ c act = void $ addCleanup c act shake-0.16.4/src/General/Chunks.hs0000644000000000000000000001202613261223301015010 0ustar0000000000000000{-# LANGUAGE RecordWildCards, ScopedTypeVariables #-} module General.Chunks( Chunks, readChunk, readChunkMax, writeChunks, writeChunk, restoreChunksBackup, withChunks, resetChunksCompact, resetChunksCorrupt ) where import System.Time.Extra import System.FilePath import Control.Concurrent.Extra import Control.Monad.Extra import Control.Exception import System.IO import System.Directory import qualified Data.ByteString as BS import Data.Word import Data.Monoid import General.Binary import General.Extra import Prelude data Chunks = Chunks {chunksFileName :: FilePath ,chunksFlush :: Maybe Seconds ,chunksHandle :: MVar Handle } --------------------------------------------------------------------- -- READ/WRITE OPERATIONS readChunk :: Chunks -> IO (Either BS.ByteString BS.ByteString) readChunk c = readChunkMax c maxBound -- | Return either a valid chunk (Right), or a trailing suffix with no information (Left) readChunkMax :: Chunks -> Word32 -> IO (Either BS.ByteString BS.ByteString) readChunkMax Chunks{..} mx = withMVar chunksHandle $ \h -> do let slop x = do unless (BS.null x) $ hSetFileSize h . 
subtract (toInteger $ BS.length x) =<< hFileSize h return $ Left x n <- BS.hGet h 4 if BS.length n < 4 then slop n else do let count = fromIntegral $ min mx $ fst $ unsafeBinarySplit n v <- BS.hGet h count if BS.length v < count then slop (n `BS.append` v) else return $ Right v writeChunkDirect :: Handle -> Builder -> IO () writeChunkDirect h x = bs `seq` BS.hPut h bs where bs = runBuilder $ putEx (fromIntegral $ sizeBuilder x :: Word32) <> x -- | If 'writeChunks' and any of the reopen operations are interleaved it will cause issues. writeChunks :: Chunks -> ((Builder -> IO ()) -> IO a) -> IO a -- We avoid calling flush too often on SSD drives, as that can be slow -- Make sure all exceptions happen on the caller, so we don't have to move exceptions back -- Make sure we only write on one thread, otherwise async exceptions can cause partial writes writeChunks Chunks{..} act = withMVar chunksHandle $ \h -> do chan <- newChan -- operations to perform on the file kick <- newEmptyMVar -- kicked whenever something is written died <- newBarrier -- has the writing thread finished flusher <- case chunksFlush of Nothing -> return Nothing Just flush -> fmap Just $ forkIO $ forever $ do takeMVar kick threadDelay $ ceiling $ flush * 1000000 tryTakeMVar kick writeChan chan $ hFlush h >> return True root <- myThreadId writer <- flip forkFinally (\e -> do signalBarrier died (); whenLeft e (throwTo root)) $ -- only one thread ever writes, ensuring only the final write can be torn whileM $ join $ readChan chan (act $ \s -> do out <- evaluate $ writeChunkDirect h s -- ensure exceptions occur on this thread writeChan chan $ out >> tryPutMVar kick () >> return True) `finally` do maybe (return ()) killThread flusher writeChan chan $ return False waitBarrier died writeChunk :: Chunks -> Builder -> IO () writeChunk Chunks{..} x = withMVar chunksHandle $ \h -> writeChunkDirect h x --------------------------------------------------------------------- -- FILENAME OPERATIONS backup x = x <.> 
"backup" restoreChunksBackup :: FilePath -> IO Bool restoreChunksBackup file = do -- complete a partially failed compress b <- doesFileExist $ backup file if not b then return False else do removeFile_ file renameFile (backup file) file return True withChunks :: FilePath -> Maybe Seconds -> (Chunks -> IO a) -> IO a withChunks file flush act = do h <- newEmptyMVar bracket_ (putMVar h =<< openFile file ReadWriteMode) (hClose =<< takeMVar h) $ act $ Chunks file flush h -- | The file is being compacted, if the process fails, use a backup. resetChunksCompact :: Chunks -> ((Builder -> IO ()) -> IO a) -> IO a resetChunksCompact Chunks{..} act = mask $ \restore -> do h <- takeMVar chunksHandle flip onException (putMVar chunksHandle h) $ restore $ do hClose h copyFile chunksFileName $ backup chunksFileName h <- openFile chunksFileName ReadWriteMode flip finally (putMVar chunksHandle h) $ restore $ do hSetFileSize h 0 hSeek h AbsoluteSeek 0 res <- act $ writeChunkDirect h hFlush h removeFile $ backup chunksFileName return res -- | The file got corrupted, return a new version. 
resetChunksCorrupt :: Maybe FilePath -> Chunks -> IO () resetChunksCorrupt copy Chunks{..} = mask $ \restore -> do h <- takeMVar chunksHandle case copy of Nothing -> return h Just copy -> do flip onException (putMVar chunksHandle h) $ restore $ do hClose h copyFile chunksFileName copy openFile chunksFileName ReadWriteMode flip finally (putMVar chunksHandle h) $ do hSetFileSize h 0 hSeek h AbsoluteSeek 0 shake-0.16.4/src/General/Binary.hs0000644000000000000000000001766113261223301015013 0ustar0000000000000000{-# LANGUAGE FlexibleInstances, ExplicitForAll, ScopedTypeVariables, Rank2Types #-} module General.Binary( BinaryOp(..), binarySplit, binarySplit2, binarySplit3, unsafeBinarySplit, Builder(..), runBuilder, sizeBuilder, BinaryEx(..), putExStorable, getExStorable, putExStorableList, getExStorableList, putExList, getExList, putExN, getExN ) where import Control.Monad import Data.Binary import Data.List.Extra import Data.Tuple.Extra import Foreign.Storable import Foreign.Ptr import System.IO.Unsafe as U import qualified Data.ByteString as BS import qualified Data.ByteString.Internal as BS import qualified Data.ByteString.Unsafe as BS import qualified Data.ByteString.Lazy as LBS import qualified Data.ByteString.UTF8 as UTF8 import Data.Functor import Data.Semigroup (Semigroup (..)) import Data.Monoid hiding ((<>)) import Prelude --------------------------------------------------------------------- -- STORE TYPE -- | An explicit and more efficient version of Binary data BinaryOp v = BinaryOp {putOp :: v -> Builder ,getOp :: BS.ByteString -> v } binarySplit :: forall a . Storable a => BS.ByteString -> (a, BS.ByteString) binarySplit bs | BS.length bs < sizeOf (undefined :: a) = error "Reading from ByteString, insufficient left" | otherwise = unsafeBinarySplit bs binarySplit2 :: forall a b . 
(Storable a, Storable b) => BS.ByteString -> (a, b, BS.ByteString) binarySplit2 bs | BS.length bs < sizeOf (undefined :: a) + sizeOf (undefined :: b) = error "Reading from ByteString, insufficient left" | (a,bs) <- unsafeBinarySplit bs, (b,bs) <- unsafeBinarySplit bs = (a,b,bs) binarySplit3 :: forall a b c . (Storable a, Storable b, Storable c) => BS.ByteString -> (a, b, c, BS.ByteString) binarySplit3 bs | BS.length bs < sizeOf (undefined :: a) + sizeOf (undefined :: b) + sizeOf (undefined :: c) = error "Reading from ByteString, insufficient left" | (a,bs) <- unsafeBinarySplit bs, (b,bs) <- unsafeBinarySplit bs, (c,bs) <- unsafeBinarySplit bs = (a,b,c,bs) unsafeBinarySplit :: Storable a => BS.ByteString -> (a, BS.ByteString) unsafeBinarySplit bs = (v, BS.unsafeDrop (sizeOf v) bs) where v = unsafePerformIO $ BS.unsafeUseAsCString bs $ \ptr -> peek (castPtr ptr) -- forM for zipWith for2M_ as bs f = zipWithM_ f as bs --------------------------------------------------------------------- -- BINARY SERIALISATION -- We can't use the Data.ByteString builder as that doesn't track the size of the chunk. data Builder = Builder {-# UNPACK #-} !Int (forall a . Ptr a -> Int -> IO ()) sizeBuilder :: Builder -> Int sizeBuilder (Builder i _) = i runBuilder :: Builder -> BS.ByteString runBuilder (Builder i f) = unsafePerformIO $ BS.create i $ \ptr -> f ptr 0 instance Semigroup Builder where (Builder x1 x2) <> (Builder y1 y2) = Builder (x1+y1) $ \p i -> do x2 p i; y2 p $ i+x1 instance Monoid Builder where mempty = Builder 0 $ \_ _ -> return () mappend = (<>) -- | Methods for Binary serialisation that go directly between strict ByteString values. -- When the Database is read each key/value will be loaded as a separate ByteString, -- and for certain types (e.g. file rules) this may remain the preferred format for storing keys. -- Optimised for performance. 
class BinaryEx a where putEx :: a -> Builder getEx :: BS.ByteString -> a instance BinaryEx BS.ByteString where putEx x = Builder n $ \ptr i -> BS.useAsCString x $ \bs -> BS.memcpy (ptr `plusPtr` i) (castPtr bs) (fromIntegral n) where n = BS.length x getEx = id instance BinaryEx LBS.ByteString where putEx x = Builder (fromIntegral $ LBS.length x) $ \ptr i -> do let go i [] = return () go i (x:xs) = do let n = BS.length x BS.useAsCString x $ \bs -> BS.memcpy (ptr `plusPtr` i) (castPtr bs) (fromIntegral n) go (i+n) xs go i $ LBS.toChunks x getEx = LBS.fromChunks . return instance BinaryEx [BS.ByteString] where -- Format: -- n :: Word32 - number of strings -- ns :: [Word32]{n} - length of each string -- contents of each string concatenated (sum ns bytes) putEx xs = Builder (4 + (n * 4) + sum ns) $ \p i -> do pokeByteOff p i (fromIntegral n :: Word32) for2M_ [4+i,8+i..] ns $ \i x -> pokeByteOff p i (fromIntegral x :: Word32) p <- return $ p `plusPtr` (i + 4 + (n * 4)) for2M_ (scanl (+) 0 ns) xs $ \i x -> BS.useAsCStringLen x $ \(bs, n) -> BS.memcpy (p `plusPtr` i) (castPtr bs) (fromIntegral n) where ns = map BS.length xs n = length ns getEx bs = unsafePerformIO $ BS.useAsCString bs $ \p -> do n <- fromIntegral <$> (peekByteOff p 0 :: IO Word32) ns :: [Word32] <- forM [1..fromIntegral n] $ \i -> peekByteOff p (i * 4) return $ snd $ mapAccumL (\bs i -> swap $ BS.splitAt (fromIntegral i) bs) (BS.drop (4 + (n * 4)) bs) ns instance BinaryEx () where putEx () = mempty getEx _ = () instance BinaryEx String where putEx = putEx . UTF8.fromString getEx = UTF8.toString instance BinaryEx (Maybe String) where putEx Nothing = mempty putEx (Just xs) = putEx $ UTF8.fromString $ '\0' : xs getEx = fmap snd . uncons . UTF8.toString instance BinaryEx [String] where putEx = putEx . map UTF8.fromString getEx = map UTF8.toString . 
getEx instance BinaryEx (String, [String]) where putEx (a,bs) = putEx $ a:bs getEx x = let a:bs = getEx x in (a,bs) instance BinaryEx Bool where putEx False = Builder 1 $ \ptr i -> pokeByteOff ptr i (0 :: Word8) putEx True = mempty getEx = BS.null instance BinaryEx Word8 where putEx = putExStorable getEx = getExStorable instance BinaryEx Word16 where putEx = putExStorable getEx = getExStorable instance BinaryEx Word32 where putEx = putExStorable getEx = getExStorable instance BinaryEx Int where putEx = putExStorable getEx = getExStorable instance BinaryEx Float where putEx = putExStorable getEx = getExStorable putExStorable :: forall a . Storable a => a -> Builder putExStorable x = Builder (sizeOf x) $ \p i -> pokeByteOff p i x getExStorable :: forall a . Storable a => BS.ByteString -> a getExStorable = \bs -> unsafePerformIO $ BS.useAsCStringLen bs $ \(p, size) -> if size /= n then error "size mismatch" else peek (castPtr p) where n = sizeOf (undefined :: a) putExStorableList :: forall a . Storable a => [a] -> Builder putExStorableList xs = Builder (n * length xs) $ \ptr i -> for2M_ [i,i+n..] xs $ \i x -> pokeByteOff ptr i x where n = sizeOf (undefined :: a) getExStorableList :: forall a . 
Storable a => BS.ByteString -> [a] getExStorableList = \bs -> unsafePerformIO $ BS.useAsCStringLen bs $ \(p, size) -> let (d,m) = size `divMod` n in if m /= 0 then error "size mismatch" else forM [0..d-1] $ \i -> peekElemOff (castPtr p) i where n = sizeOf (undefined :: a) -- repeating: -- Word32, length of BS -- BS putExList :: [Builder] -> Builder putExList xs = Builder (sum $ map (\b -> sizeBuilder b + 4) xs) $ \p i -> do let go i [] = return () go i (Builder n b:xs) = do pokeByteOff p i (fromIntegral n :: Word32) b p (i+4) go (i+4+n) xs go i xs getExList :: BS.ByteString -> [BS.ByteString] getExList bs | len == 0 = [] | len >= 4 , (n :: Word32, bs) <- unsafeBinarySplit bs , n <- fromIntegral n , (len - 4) >= n = BS.unsafeTake n bs : getExList (BS.unsafeDrop n bs) | otherwise = error "getList, corrupted binary" where len = BS.length bs putExN :: Builder -> Builder putExN (Builder n old) = Builder (n+4) $ \p i -> do pokeByteOff p i (fromIntegral n :: Word32) old p $ i+4 getExN :: BS.ByteString -> (BS.ByteString, BS.ByteString) getExN bs | len >= 4 , (n :: Word32, bs) <- unsafeBinarySplit bs , n <- fromIntegral n , (len - 4) >= n = (BS.unsafeTake n bs, BS.unsafeDrop n bs) | otherwise = error "getList, corrupted binary" where len = BS.length bs shake-0.16.4/src/General/Bilist.hs0000644000000000000000000000174513261223301015011 0ustar0000000000000000 -- | List type that supports O(1) amortized 'cons', 'snoc', 'uncons' and 'isEmpty'. 
module General.Bilist( Bilist, cons, snoc, uncons, toList, isEmpty ) where import Data.Semigroup (Semigroup(..)) import Data.Monoid hiding ((<>)) import Prelude data Bilist a = Bilist [a] [a] toList :: Bilist a -> [a] toList (Bilist as bs) = as ++ reverse bs isEmpty :: Bilist a -> Bool isEmpty (Bilist as bs) = null as && null bs instance Eq a => Eq (Bilist a) where a == b = toList a == toList b instance Semigroup (Bilist a) where a <> b = Bilist (toList a ++ toList b) [] instance Monoid (Bilist a) where mempty = Bilist [] [] mappend = (<>) cons :: a -> Bilist a -> Bilist a cons x (Bilist as bs) = Bilist (x:as) bs snoc :: Bilist a -> a -> Bilist a snoc (Bilist as bs) x = Bilist as (x:bs) uncons :: Bilist a -> Maybe (a, Bilist a) uncons (Bilist [] []) = Nothing uncons (Bilist (a:as) bs) = Just (a, Bilist as bs) uncons (Bilist [] bs) = uncons $ Bilist (reverse bs) [] shake-0.16.4/src/General/Bag.hs0000644000000000000000000000223413261223301014246 0ustar0000000000000000 -- | A bag of elements that you can pull at either deterministically or randomly. module General.Bag( Bag, Randomly, emptyPure, emptyRandom, insert, remove ) where import qualified Data.HashMap.Strict as Map import System.Random -- Monad for random (but otherwise pure) computations type Randomly a = IO a data Bag a = BagPure [a] | BagRandom {-# UNPACK #-} !Int (Map.HashMap Int a) -- HashMap has O(n) Map.size so we record it separately emptyPure :: Bag a emptyPure = BagPure [] emptyRandom :: Bag a emptyRandom = BagRandom 0 Map.empty insert :: a -> Bag a -> Bag a insert x (BagPure xs) = BagPure $ x:xs insert x (BagRandom n mp) = BagRandom (n+1) $ Map.insert n x mp remove :: Bag a -> Maybe (Randomly (a, Bag a)) remove (BagPure []) = Nothing remove (BagPure (x:xs)) = Just $ return (x, BagPure xs) remove (BagRandom n mp) | n == 0 = Nothing | n == 1 = Just $ return (mp Map.! 
0, emptyRandom) | otherwise = Just $ do i <- randomRIO (0, n-1) let mp2 | i == n-1 = Map.delete i mp | otherwise = Map.insert i (mp Map.! (n-1)) $ Map.delete (n-1) mp return (mp Map.! i, BagRandom (n-1) mp2) shake-0.16.4/src/Development/0000755000000000000000000000000013261223301014125 5ustar0000000000000000shake-0.16.4/src/Development/Shake.hs0000644000000000000000000002270213261223301015517 0ustar0000000000000000{-# LANGUAGE TypeFamilies, ConstraintKinds #-} -- | This module is used for defining Shake build systems. As a simple example of a Shake build system, -- let us build the file @result.tar@ from the files listed by @result.txt@: -- -- @ -- import "Development.Shake" -- import "Development.Shake.FilePath" -- -- main = 'shakeArgs' 'shakeOptions' $ do -- 'want' [\"result.tar\"] -- \"*.tar\" '%>' \\out -> do -- contents \<- 'readFileLines' $ out 'Development.Shake.FilePath.-<.>' \"txt\" -- 'need' contents -- 'cmd' \"tar -cf\" [out] contents -- @ -- -- We start by importing the modules defining both Shake and routines for manipulating 'FilePath' values. -- We define @main@ to call 'shake' with the default 'shakeOptions'. As the second argument to -- 'shake', we provide a set of rules. There are two common forms of rules, 'want' to specify target files, -- and '%>' to define a rule which builds a 'FilePattern'. We use 'want' to require that after the build -- completes the file @result.tar@ should be ready. -- -- The @*.tar@ rule describes how to build files with the extension @.tar@, including @result.tar@. -- We 'readFileLines' on @result.txt@, after changing the @.tar@ extension to @.txt@. We read each line -- into the variable @contents@ -- being a list of the files that should go into @result.tar@. Next, we -- depend ('need') all the files in @contents@. If any of these files change, the rule will be repeated. -- Finally we call the @tar@ program. 
If either @result.txt@ changes, or any of the files listed by @result.txt@ -- change, then @result.tar@ will be rebuilt. -- -- To find out more: -- -- * The user manual contains a longer example and background information on how to use Shake -- . -- -- * The home page has links to additional information , including -- a mailing list. -- -- * The theory behind Shake is described in an ICFP 2012 paper, -- . -- The associated talk forms a short overview of Shake . module Development.Shake( -- * Writing a build system -- $writing -- * GHC build flags -- $flags -- * Core shake, shakeOptions, Rules, action, withoutActions, alternatives, priority, Action, traced, liftIO, actionOnException, actionFinally, runAfter, ShakeException(..), -- * Configuration ShakeOptions(..), Rebuild(..), Lint(..), Change(..), getShakeOptions, getShakeOptionsRules, getHashedShakeVersion, getShakeExtra, getShakeExtraRules, addShakeExtra, -- ** Command line shakeArgs, shakeArgsWith, shakeArgsOptionsWith, shakeOptDescrs, -- ** Progress reporting Progress(..), progressSimple, progressDisplay, progressTitlebar, progressProgram, getProgress, -- ** Verbosity Verbosity(..), getVerbosity, putLoud, putNormal, putQuiet, withVerbosity, quietly, -- * Running commands command, command_, cmd, cmd_, unit, Stdout(..), Stderr(..), Stdouterr(..), Exit(..), Process(..), CmdTime(..), CmdLine(..), CmdResult, CmdString, CmdOption(..), addPath, addEnv, -- * Explicit parallelism parallel, forP, par, -- * Utility functions copyFile', copyFileChanged, readFile', readFileLines, writeFile', writeFileLines, writeFileChanged, removeFiles, removeFilesAfter, withTempFile, withTempDir, withTempFileWithin, withTempDirWithin, -- * File rules need, want, (%>), (|%>), (?>), phony, (~>), phonys, (&%>), (&?>), orderOnly, orderOnlyAction, FilePattern, (?==), (<//>), filePattern, needed, trackRead, trackWrite, trackAllow, -- * Directory rules doesFileExist, doesDirectoryExist, getDirectoryContents, getDirectoryFiles, getDirectoryDirs, getDirectoryFilesIO,
-- * Environment rules getEnv, getEnvWithDefault, -- * Oracle rules ShakeValue, RuleResult, addOracle, addOracleCache, askOracle, -- * Special rules alwaysRerun, -- * Resources Resource, newResource, newResourceIO, withResource, withResources, newThrottle, newThrottleIO, unsafeExtraThread, -- * Cache newCache, newCacheIO, -- * Batching needHasChanged, resultHasChanged, batch, -- * Deprecated (*>), (|*>), (&*>), (**>), (*>>), (?>>), askOracleWith ) where import Prelude(Maybe, FilePath) -- Since GHC 7.10 duplicates *> -- I would love to use module export in the above export list, but alas Haddock -- then shows all the things that are hidden in the docs, which is terrible. import Control.Monad.IO.Class import Development.Shake.Internal.Value import Development.Shake.Internal.Options import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Derived import Development.Shake.Internal.Errors import Development.Shake.Internal.Progress import Development.Shake.Internal.Args import Development.Shake.Internal.Shake import Development.Shake.Command import Development.Shake.Internal.FilePattern import Development.Shake.Internal.Rules.Directory import Development.Shake.Internal.Rules.File import Development.Shake.Internal.Rules.Files import Development.Shake.Internal.Rules.Oracle import Development.Shake.Internal.Rules.OrderOnly import Development.Shake.Internal.Rules.Rerun -- $writing -- -- When writing a Shake build system, start by defining what you 'want', then write rules -- with '%>' to produce the results. Before calling 'cmd' you should ensure that any files the command -- requires are demanded with calls to 'need'. We offer the following advice to Shake users: -- -- * If @ghc --make@ or @cabal@ is capable of building your project, use that instead. Custom build systems are -- necessary for many complex projects, but many projects are not complex. 
-- -- * The 'shakeArgs' function automatically handles command line arguments. To define non-file targets use 'phony'. -- -- * Put all result files in a distinguished directory, for example @_make@. You can implement a @clean@ -- command by removing that directory, using @'removeFilesAfter' \"_make\" [\"\/\/\*\"]@. -- -- * To obtain parallel builds set 'shakeThreads' to a number greater than 1. -- -- * Lots of compilers produce @.o@ files. To avoid overlapping rules, use @.c.o@ for C compilers, -- @.hs.o@ for Haskell compilers etc. -- -- * Do not be afraid to mix Shake rules, system commands and other Haskell libraries -- use each for what -- it does best. -- -- * The more accurate the dependencies are, the better. Use additional rules like 'doesFileExist' and -- 'getDirectoryFiles' to track information other than just the contents of files. For information in the environment -- that you suspect will change regularly (perhaps @ghc@ version number), either write the information to -- a file with 'alwaysRerun' and 'writeFileChanged', or use 'addOracle'. -- $flags -- -- For large build systems the choice of GHC flags can have a significant impact. We recommend: -- -- > ghc --make MyBuildSystem -threaded -rtsopts "-with-rtsopts=-I0 -qg -qb" -- -- * @-rtsopts@: Allow the setting of further GHC options at runtime. -- -- * @-I0@: Disable idle garbage collection, to avoid frequent unnecessary garbage collection, see -- . -- -- * With GHC 7.6 and before, omit @-threaded@: -- can cause a race condition in build systems that write files then read them. Omitting @-threaded@ will -- still allow your 'cmd' actions to run in parallel, so most build systems will still run in parallel. -- -- * With GHC 7.8 and later you may add @-threaded@, and pass the options @-qg -qb@ to @-with-rtsopts@ -- to disable parallel garbage collection. 
Parallel garbage collection in Shake -- programs typically goes slower than sequential garbage collection, while occupying many cores that -- could be used for running system commands. --------------------------------------------------------------------- -- DEPRECATED SINCE 0.13, MAY 2014 infix 1 **>, ?>>, *>> -- | /Deprecated:/ Alias for '|%>'. (**>) :: [FilePattern] -> (FilePath -> Action ()) -> Rules () (**>) = (|%>) -- | /Deprecated:/ Alias for '&?>'. (?>>) :: (FilePath -> Maybe [FilePath]) -> ([FilePath] -> Action ()) -> Rules () (?>>) = (&?>) -- | /Deprecated:/ Alias for '&%>'. (*>>) :: [FilePattern] -> ([FilePath] -> Action ()) -> Rules () (*>>) = (&%>) --------------------------------------------------------------------- -- DEPRECATED SINCE 0.14, MAY 2014 infix 1 *>, |*>, &*> -- | /Deprecated:/ Alias for '%>'. Note that @*>@ clashes with a Prelude operator in GHC 7.10. (*>) :: FilePattern -> (FilePath -> Action ()) -> Rules () (*>) = (%>) -- | /Deprecated:/ Alias for '|%>'. (|*>) :: [FilePattern] -> (FilePath -> Action ()) -> Rules () (|*>) = (|%>) -- | /Deprecated:/ Alias for '&%>'. (&*>) :: [FilePattern] -> ([FilePath] -> Action ()) -> Rules () (&*>) = (&%>) --------------------------------------------------------------------- -- DEPRECATED SINCE 0.16.1, NOV 2017 -- | /Deprecated:/ Replace @'askOracleWith' q a@ by @'askOracle' q@ -- since the 'RuleResult' type family now fixes the result type. askOracleWith :: (RuleResult q ~ a, ShakeValue q, ShakeValue a) => q -> a -> Action a askOracleWith question _ = askOracle question shake-0.16.4/src/Development/Shake/0000755000000000000000000000000013261223301015160 5ustar0000000000000000shake-0.16.4/src/Development/Shake/Util.hs0000644000000000000000000001024713261223301016435 0ustar0000000000000000 -- | A module for useful utility functions for Shake build systems.
module Development.Shake.Util( parseMakefile, needMakefileDependencies, neededMakefileDependencies, shakeArgsAccumulate, shakeArgsPrune, shakeArgsPruneWith, ) where import Development.Shake import Development.Shake.Internal.Rules.File import qualified Data.ByteString.Char8 as BS import qualified General.Makefile as BS import Data.Tuple.Extra import Control.Applicative import Data.List import General.GetOpt import Data.IORef import Data.Maybe import Control.Monad.Extra import Prelude import System.IO.Extra as IO -- | Given the text of a Makefile, extract the list of targets and dependencies. Assumes a -- small subset of Makefile syntax, mostly that generated by @gcc -MM@. -- -- > parseMakefile "a: b c\nd : e" == [("a",["b","c"]),("d",["e"])] parseMakefile :: String -> [(FilePath, [FilePath])] parseMakefile = map (BS.unpack *** map BS.unpack) . BS.parseMakefile . BS.pack -- | Depend on the dependencies listed in a Makefile. Does not depend on the Makefile itself. -- -- > needMakefileDependencies file = need . concatMap snd . parseMakefile =<< liftIO (readFile file) needMakefileDependencies :: FilePath -> Action () needMakefileDependencies file = needBS . concatMap snd . BS.parseMakefile =<< liftIO (BS.readFile file) -- | Depend on the dependencies listed in a Makefile. Does not depend on the Makefile itself. -- Use this function to indicate that you have /already/ used the files in question. -- -- > neededMakefileDependencies file = needed . concatMap snd . parseMakefile =<< liftIO (readFile file) neededMakefileDependencies :: FilePath -> Action () neededMakefileDependencies file = neededBS . concatMap snd . BS.parseMakefile =<< liftIO (BS.readFile file) -- | Like `shakeArgsWith`, but instead of accumulating a list of flags, apply functions to a default value. -- Usually used to populate a record structure. 
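The accumulation described here can be seen in isolation: each parsed flag is a function @a -> a@, and they are applied left-to-right over the default value with @foldl' (flip ($))@, exactly as in the implementation below. A self-contained sketch (the @Flags@ record is hypothetical):

```haskell
import Data.List (foldl')

-- A hypothetical options record populated by flag functions.
data Flags = Flags {distCC :: Bool, jobs :: Int} deriving (Eq, Show)

-- Each parsed flag becomes a record update; applying them in order
-- over the default value is foldl' (flip ($)).
applyFlags :: Flags -> [Flags -> Flags] -> Flags
applyFlags = foldl' (flip ($))

main :: IO ()
main = print (applyFlags (Flags False 1) [\f -> f{distCC = True}, \f -> f{jobs = 4}])
-- prints Flags {distCC = True, jobs = 4}
```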
As an example of a build system that can use either @gcc@ or @distcc@ for compiling: -- -- @ -- import System.Console.GetOpt -- -- data Flags = Flags {distCC :: Bool} deriving Eq -- flags = [Option \"\" [\"distcc\"] (NoArg $ Right $ \\x -> x{distCC=True}) \"Run distributed.\"] -- -- main = 'shakeArgsAccumulate' 'shakeOptions' flags (Flags False) $ \\flags targets -> return $ Just $ do -- if null targets then 'want' [\"result.exe\"] else 'want' targets -- let compiler = if distCC flags then \"distcc\" else \"gcc\" -- \"*.o\" '%>' \\out -> do -- 'need' ... -- 'cmd' compiler ... -- ... -- @ -- -- Now you can pass @--distcc@ to use the @distcc@ compiler. shakeArgsAccumulate :: ShakeOptions -> [OptDescr (Either String (a -> a))] -> a -> (a -> [String] -> IO (Maybe (Rules ()))) -> IO () shakeArgsAccumulate opts flags def f = shakeArgsWith opts flags $ \flags targets -> f (foldl' (flip ($)) def flags) targets -- | Like 'shakeArgs' but also takes a pruning function. If @--prune@ is passed, then after the build has completed, -- the second argument is called with a list of the files that the build checked were up-to-date. shakeArgsPrune :: ShakeOptions -> ([FilePath] -> IO ()) -> Rules () -> IO () shakeArgsPrune opts prune rules = shakeArgsPruneWith opts prune [] f where f _ files = return $ Just $ if null files then rules else want files >> withoutActions rules -- | A version of 'shakeArgsPrune' that also takes a list of extra options to use. 
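The pruning contract can be illustrated without Shake: the build records the set of live (checked up-to-date) files via @shakeLiveFiles@, and a typical prune callback deletes everything else under the build tree. A hedged sketch, where @allFiles@ stands in for a directory listing:

```haskell
import Data.List ((\\))

-- Given every file under the output directory and the live files the
-- build reported, the stale set is the difference; deleting it is
-- what a typical prune callback would do.
staleFiles :: [FilePath] -> [FilePath] -> [FilePath]
staleFiles allFiles live = allFiles \\ live

main :: IO ()
main = print (staleFiles ["_make/a.o","_make/b.o","_make/old.o"] ["_make/a.o","_make/b.o"])
-- prints ["_make/old.o"]
```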
shakeArgsPruneWith :: ShakeOptions -> ([FilePath] -> IO ()) -> [OptDescr (Either String a)] -> ([a] -> [String] -> IO (Maybe (Rules ()))) -> IO () shakeArgsPruneWith opts prune flags act = do let flags2 = Option "P" ["prune"] (NoArg $ Right Nothing) "Remove stale files" : map (fmapOptDescr Just) flags pruning <- newIORef False shakeArgsWith opts flags2 $ \opts args -> case sequence opts of Nothing -> do writeIORef pruning True return Nothing Just opts -> act opts args whenM (readIORef pruning) $ IO.withTempFile $ \file -> do shakeArgsWith opts{shakeLiveFiles=file : shakeLiveFiles opts} flags2 $ \opts args -> act (catMaybes opts) args src <- lines <$> IO.readFile' file prune src shake-0.16.4/src/Development/Shake/Rule.hs0000644000000000000000000000125413261223301016425 0ustar0000000000000000 -- | This module is used for defining new types of rules for Shake build systems. -- Most users will find the built-in set of rules sufficient. module Development.Shake.Rule( -- * Defining builtin rules addBuiltinRule, BuiltinLint, noLint, BuiltinRun, RunChanged(..), RunResult(..), -- * Calling builtin rules apply, apply1, -- * User rules UserRule(..), addUserRule, getUserRules, userRuleMatch, -- * Lint integration trackUse, trackChange, trackAllow ) where import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Action import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Core.Rules shake-0.16.4/src/Development/Shake/Forward.hs0000644000000000000000000001042413261223301017121 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, Rank2Types, ScopedTypeVariables #-} {-# LANGUAGE TypeFamilies #-} -- | A module for producing forward-defined build systems, in contrast to standard backwards-defined -- build systems such as shake. Based around ideas from . 
-- As an example: -- -- @ -- import "Development.Shake" -- import "Development.Shake.Forward" -- import "Development.Shake.FilePath" -- -- main = 'shakeArgsForward' 'shakeOptions' $ do -- contents <- 'readFileLines' \"result.txt\" -- 'cache' $ 'cmd' \"tar -cf result.tar\" contents -- @ -- -- Compared to backward-defined build systems (such as normal Shake), forward-defined build -- systems tend to be simpler for simple systems (less boilerplate, more direct style), but more -- complex for larger build systems (requires explicit parallelism, explicit sharing of build products, -- no automatic command line targets). As a general approach for writing forward-defined systems: -- -- * Figure out the sequence of system commands that will build your project. -- -- * Write a simple 'Action' that builds your project. -- -- * Insert 'cache' in front of most system commands. -- -- * Replace most loops with 'forP', where they can be executed in parallel. -- -- * Where Haskell performs real computation, if zero-build performance is insufficient, use 'cacheAction'. -- -- All forward-defined systems use 'AutoDeps', which requires @fsatrace@ to be on the @$PATH@. -- You can obtain @fsatrace@ from . 
module Development.Shake.Forward( shakeForward, shakeArgsForward, forwardOptions, forwardRule, cache, cacheAction ) where import Development.Shake import Development.Shake.Rule import Development.Shake.Command import Development.Shake.Classes import Development.Shake.FilePath import Data.IORef import Data.Either import Data.List.Extra import Control.Exception.Extra import Numeric import System.IO.Unsafe import qualified Data.ByteString as BS import qualified Data.HashMap.Strict as Map {-# NOINLINE forwards #-} forwards :: IORef (Map.HashMap ForwardQ (Action ())) forwards = unsafePerformIO $ newIORef Map.empty newtype ForwardQ = ForwardQ String deriving (Hashable,Typeable,Eq,NFData,Binary) type instance RuleResult ForwardQ = () instance Show ForwardQ where show (ForwardQ x) = x -- | Run a forward-defined build system. shakeForward :: ShakeOptions -> Action () -> IO () shakeForward opts act = shake (forwardOptions opts) (forwardRule act) -- | Run a forward-defined build system, interpreting command-line arguments. shakeArgsForward :: ShakeOptions -> Action () -> IO () shakeArgsForward opts act = shakeArgs (forwardOptions opts) (forwardRule act) -- | Given an 'Action', turn it into a 'Rules' structure which runs in forward mode. forwardRule :: Action () -> Rules () forwardRule act = do addBuiltinRule noLint $ \k old dirty -> case old of Just old | not dirty -> return $ RunResult ChangedNothing old () _ -> do res <- liftIO $ atomicModifyIORef forwards $ \mp -> (Map.delete k mp, Map.lookup k mp) case res of Nothing -> liftIO $ errorIO "Failed to find action name" Just act -> act return $ RunResult ChangedRecomputeSame BS.empty () action act -- | Given a 'ShakeOptions', set the options necessary to execute in forward mode. forwardOptions :: ShakeOptions -> ShakeOptions forwardOptions opts = opts{shakeCommandOptions=[AutoDeps]} -- | Cache an action. The name of the action must be unique for all different actions. 
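The mechanism behind 'cacheAction' (visible in the @forwards@ IORef above) is a global registry: the action is stored under its key, the rule machinery looks the key up and runs it, and the entry is removed afterwards. A self-contained sketch of that hand-off pattern, with @demandOnce@ standing in for the @apply@ step:

```haskell
import Data.IORef
import qualified Data.Map.Strict as Map

-- Look up an action registered under a key, deleting the
-- registration as we go -- mirroring the forwards IORef pattern.
demandOnce :: IORef (Map.Map String (IO ())) -> String -> IO ()
demandOnce reg key = do
  act <- atomicModifyIORef reg $ \mp -> (Map.delete key mp, Map.lookup key mp)
  case act of
    Nothing -> ioError (userError "Failed to find action name")
    Just a  -> a

main :: IO ()
main = do
  reg <- newIORef Map.empty
  atomicModifyIORef reg $ \mp -> (Map.insert "hello" (putStrLn "running hello") mp, ())
  demandOnce reg "hello"  -- prints: running hello
```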
cacheAction :: String -> Action () -> Action () cacheAction name action = do let key = ForwardQ name liftIO $ atomicModifyIORef forwards $ \mp -> (Map.insert key action mp, ()) _ :: [()] <- apply [key] liftIO $ atomicModifyIORef forwards $ \mp -> (Map.delete key mp, ()) -- | Apply caching to an external command. cache :: (forall r . CmdArguments r => r) -> Action () cache cmd = do let CmdArgument args = cmd let isDull ['-',x] = True; isDull _ = False let name = head $ filter (not . isDull) (drop 1 $ rights args) ++ ["unknown"] cacheAction ("command " ++ toStandard name ++ " #" ++ upper (showHex (abs $ hash $ show args) "")) cmd shake-0.16.4/src/Development/Shake/FilePath.hs0000644000000000000000000001000413261223301017203 0ustar0000000000000000{-# LANGUAGE CPP #-} #ifndef MIN_VERSION_filepath #if __GLASGOW_HASKELL__ >= 709 #define MIN_VERSION_filepath(a,b,c) 1 #else #define MIN_VERSION_filepath(a,b,c) 0 #endif #endif -- | A module for 'FilePath' operations exposing "System.FilePath" plus some additional operations. -- -- /Windows note:/ The extension methods ('<.>', 'takeExtension' etc) use the Posix variants since on -- Windows @\"\/\/\*\" '<.>' \"txt\"@ produces @\"\/\/\*\\\\.txt\"@ -- (which is bad for 'Development.Shake.FilePattern' values). 
module Development.Shake.FilePath( module System.FilePath, module System.FilePath.Posix, dropDirectory1, takeDirectory1, normaliseEx, #if !MIN_VERSION_filepath(1,4,0) (-<.>), #endif toNative, toStandard, exe ) where import System.Info.Extra import qualified System.FilePath as Native import System.FilePath hiding (splitExtension, takeExtension, replaceExtension, dropExtension, addExtension ,hasExtension, (<.>), splitExtensions, takeExtensions, dropExtensions #if MIN_VERSION_filepath(1,4,0) ,(-<.>) #endif ) import System.FilePath.Posix (splitExtension, takeExtension, replaceExtension, dropExtension, addExtension ,hasExtension, (<.>), splitExtensions, takeExtensions, dropExtensions #if MIN_VERSION_filepath(1,4,0) ,(-<.>) #endif ) #if !MIN_VERSION_filepath(1,4,0) infixr 7 -<.> -- | Remove the current extension and add another, an alias for 'replaceExtension'. (-<.>) :: FilePath -> String -> FilePath (-<.>) = replaceExtension #endif -- | Drop the first directory from a 'FilePath'. Should only be used on -- relative paths. -- -- > dropDirectory1 "aaa/bbb" == "bbb" -- > dropDirectory1 "aaa/" == "" -- > dropDirectory1 "aaa" == "" -- > dropDirectory1 "" == "" dropDirectory1 :: FilePath -> FilePath dropDirectory1 = drop 1 . dropWhile (not . isPathSeparator) -- | Take the first component of a 'FilePath'. Should only be used on -- relative paths. -- -- > takeDirectory1 "aaa/bbb" == "aaa" -- > takeDirectory1 "aaa/" == "aaa" -- > takeDirectory1 "aaa" == "aaa" takeDirectory1 :: FilePath -> FilePath takeDirectory1 = takeWhile (not . isPathSeparator) -- | Normalise a 'FilePath', applying the rules: -- -- * All 'pathSeparators' become 'pathSeparator' (@\/@ on Linux, @\\@ on Windows) -- -- * @foo\/bar\/..\/baz@ becomes @foo\/baz@ (not universally true in the presence of symlinks) -- -- * @foo\/.\/bar@ becomes @foo\/bar@ -- -- * @foo\/\/bar@ becomes @foo\/bar@ -- -- This function is not based on the 'normalise' function from the @filepath@ library, as that function -- is quite broken. 
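The rules documented above can be captured by a tiny reference model over @/@-separated relative paths. This sketch (the name @normaliseSketch@ is illustrative, not part of Shake) ignores the native-separator and Windows UNC handling that the real 'normaliseEx' performs:

```haskell
import Data.List (intercalate)

-- Apply the documented rules: drop ".", collapse "//", and cancel
-- "dir/.." pairs. Relative /-separated paths only.
normaliseSketch :: FilePath -> FilePath
normaliseSketch p = case intercalate "/" (reverse (foldl step [] parts)) of
    "" -> "."
    s  -> s
  where
    parts = filter (`notElem` ["", "."]) (splitSlash p)
    step (d : ds) ".." | d /= ".." = ds   -- foo/.. cancels out
    step ds x = x : ds
    splitSlash s = case break (== '/') s of
      (a, _ : rest) -> a : splitSlash rest
      (a, [])       -> [a]

main :: IO ()
main = putStrLn (normaliseSketch "foo//./bar/../baz")  -- prints foo/baz
```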
normaliseEx :: FilePath -> FilePath normaliseEx xs | a:b:xs <- xs, isWindows && sep a && sep b = '/' : f ('/':xs) -- account for UNC paths being double // | otherwise = f xs where sep = Native.isPathSeparator f o = toNative $ deslash o $ (++"/") $ concatMap ('/':) $ reverse $ g 0 $ reverse $ split o deslash o x | x == "/" = case (pre,pos) of (True,True) -> "/" (True,False) -> "/." (False,True) -> "./" (False,False) -> "." | otherwise = (if pre then id else tail) $ (if pos then id else init) x where pre = sep $ head $ o ++ " " pos = sep $ last $ " " ++ o g i [] = replicate i ".." g i ("..":xs) = g (i+1) xs g i (".":xs) = g i xs g 0 (x:xs) = x : g 0 xs g i (x:xs) = g (i-1) xs split xs = if null ys then [] else a : split b where (a,b) = break sep ys ys = dropWhile sep xs -- | Convert to native path separators, namely @\\@ on Windows. toNative :: FilePath -> FilePath toNative = if isWindows then map (\x -> if x == '/' then '\\' else x) else id -- | Convert all path separators to @/@, even on Windows. toStandard :: FilePath -> FilePath toStandard = if isWindows then map (\x -> if x == '\\' then '/' else x) else id -- | The extension of executables, @\"exe\"@ on Windows and @\"\"@ otherwise. exe :: String exe = if isWindows then "exe" else "" shake-0.16.4/src/Development/Shake/Config.hs0000644000000000000000000001116613261223301016726 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable, GeneralizedNewtypeDeriving, TypeFamilies #-} -- | A module for parsing and using config files in a Shake build system. Config files -- consist of variable bindings, for example: -- -- > # This is my Config file -- > HEADERS_DIR = /path/to/dir -- > CFLAGS = -g -I${HEADERS_DIR} -- > CFLAGS = $CFLAGS -O2 -- > include extra/file.cfg -- -- This defines the variable @HEADERS_DIR@ (equal to @\/path\/to\/dir@), and -- @CFLAGS@ (equal to @-g -I\/path\/to\/dir -O2@), and also includes the configuration -- statements in the file @extra/file.cfg@. 
The full lexical syntax for configuration -- files is defined here: . -- The use of Ninja file syntax is due to convenience and the desire to reuse an -- externally-defined specification (but the choice of configuration language is mostly arbitrary). -- -- To use the configuration file either use 'readConfigFile' to parse the configuration file -- and use the values directly, or 'usingConfigFile' and 'getConfig' to track the configuration -- values, so they become build dependencies. module Development.Shake.Config( readConfigFile, readConfigFileWithEnv, usingConfigFile, usingConfig, getConfig, getConfigKeys ) where import Development.Shake import Development.Shake.Classes import qualified Development.Ninja.Parse as Ninja import qualified Development.Ninja.Env as Ninja import qualified Data.HashMap.Strict as Map import qualified Data.ByteString.UTF8 as UTF8 import Control.Applicative import Data.Tuple.Extra import Data.List import Prelude -- | Read a config file, returning a list of the variables and their bindings. -- Config files use the Ninja lexical syntax: -- readConfigFile :: FilePath -> IO (Map.HashMap String String) readConfigFile = readConfigFileWithEnv [] -- | Read a config file with an initial environment, returning a list of the variables and their bindings. -- Config files use the Ninja lexical syntax: -- readConfigFileWithEnv :: [(String, String)] -> FilePath -> IO (Map.HashMap String String) readConfigFileWithEnv vars file = do env <- Ninja.newEnv mapM_ (uncurry (Ninja.addEnv env) . (UTF8.fromString *** UTF8.fromString)) vars Ninja.parse file env mp <- Ninja.fromEnv env return $ Map.fromList $ map (UTF8.toString *** UTF8.toString) $ Map.toList mp newtype Config = Config String deriving (Show,Typeable,Eq,Hashable,Binary,NFData) newtype ConfigKeys = ConfigKeys () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) type instance RuleResult Config = Maybe String type instance RuleResult ConfigKeys = [String] -- | Specify the file to use with 'getConfig'. 
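The variable-binding semantics described at the top of this module (later bindings may reference earlier ones) can be sketched with a toy interpreter. This is not the Ninja lexer Shake actually uses: it supports only the @${VAR}@ form, no @$VAR@, no @include@, and its names are illustrative:

```haskell
import qualified Data.Map.Strict as Map
import Data.Char (isAlphaNum)

-- Interpret "KEY = value" lines in order, expanding ${VAR} from the
-- bindings seen so far. Non-binding lines (comments, blanks) are skipped.
interpConfig :: [String] -> Map.Map String String
interpConfig = foldl step Map.empty
  where
    step env l = case break (== '=') l of
      (k, '=' : v) -> Map.insert (strip k) (expand env (strip v)) env
      _            -> env
    strip = unwords . words
    expand env ('$' : '{' : rest) =
      let (name, rest') = span (\c -> isAlphaNum c || c == '_') rest
      in Map.findWithDefault "" name env ++ expand env (drop 1 rest')
    expand env (c : cs) = c : expand env cs
    expand _ [] = []

main :: IO ()
main = print (Map.lookup "CFLAGS" (interpConfig
  [ "# This is my Config file"
  , "HEADERS_DIR = /path/to/dir"
  , "CFLAGS = -g -I${HEADERS_DIR}"
  , "CFLAGS = ${CFLAGS} -O2" ]))
-- prints Just "-g -I/path/to/dir -O2"
```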
usingConfigFile :: FilePath -> Rules () usingConfigFile file = do mp <- newCache $ \() -> do need [file] liftIO $ readConfigFile file addOracle $ \(Config x) -> Map.lookup x <$> mp () addOracle $ \(ConfigKeys ()) -> sort . Map.keys <$> mp () return () -- | Specify the values to use with 'getConfig', generally prefer -- 'usingConfigFile' unless you also need access to the values -- of variables outside 'Action'. usingConfig :: Map.HashMap String String -> Rules () usingConfig mp = do addOracle $ \(Config x) -> return $ Map.lookup x mp addOracle $ \(ConfigKeys ()) -> return $ sort $ Map.keys mp return () -- | Obtain the value of a configuration variable, returns 'Nothing' to indicate the variable -- has no binding. Any build system using 'getConfig' /must/ call either 'usingConfigFile' -- or 'usingConfig'. The 'getConfig' function will introduce a dependency on the configuration -- variable (but not the whole configuration file), and if the configuration variable changes, the rule will be rerun. -- As an example: -- -- @ -- 'usingConfigFile' \"myconfiguration.cfg\" -- \"*.o\" '%>' \\out -> do -- cflags <- 'getConfig' \"CFLAGS\" -- 'cmd' \"gcc\" [out '-<.>' \"c\"] (fromMaybe \"\" cflags) -- @ getConfig :: String -> Action (Maybe String) getConfig = askOracle . Config -- | Obtain the configuration keys. -- Any build system using 'getConfigKeys' /must/ call either 'usingConfigFile' or 'usingConfig'. -- The 'getConfigKeys' function will introduce a dependency on the configuration keys -- (but not the whole configuration file), and if the configuration keys change, the rule will be rerun. -- Usually use as part of an action. 
-- As an example: -- -- @ -- 'usingConfigFile' \"myconfiguration.cfg\" -- 'action' $ need =<< getConfigKeys -- @ getConfigKeys :: Action [String] getConfigKeys = askOracle $ ConfigKeys () shake-0.16.4/src/Development/Shake/Command.hs0000644000000000000000000007105413261223301017101 0ustar0000000000000000{-# LANGUAGE CPP #-} {-# LANGUAGE FlexibleInstances, TypeSynonymInstances, TypeOperators, ScopedTypeVariables, NamedFieldPuns #-} {-# LANGUAGE GADTs, GeneralizedNewtypeDeriving #-} #if __GLASGOW_HASKELL__ < 710 {-# LANGUAGE OverlappingInstances #-} #endif -- | This module provides functions for calling command line programs, primarily -- 'command' and 'cmd'. As a simple example: -- -- @ -- 'command' [] \"gcc\" [\"-c\",myfile] -- @ -- -- The functions from this module are now available directly from "Development.Shake". -- You should only need to import this module if you are using the 'cmd' function in the 'IO' monad. module Development.Shake.Command( command, command_, cmd, cmd_, unit, CmdArgument(..), CmdArguments(..), IsCmdArgument(..), (:->), Stdout(..), Stderr(..), Stdouterr(..), Exit(..), Process(..), CmdTime(..), CmdLine(..), CmdResult, CmdString, CmdOption(..), addPath, addEnv, ) where import Data.Tuple.Extra import Control.Monad.Extra import Control.Monad.IO.Class import Data.Char import Data.Either.Extra import Data.List.Extra import Data.Maybe import Data.Semigroup (Semigroup) import Data.Monoid import System.Directory import System.Environment.Extra import System.Exit import System.IO.Extra hiding (withTempFile, withTempDir) import System.Process import System.Info.Extra import System.Time.Extra import System.IO.Unsafe(unsafeInterleaveIO) import qualified Data.ByteString as BS import qualified Data.ByteString.Lazy.Char8 as LBS import General.Extra import General.Process import Control.Applicative import Prelude import Development.Shake.Internal.CmdOption import Development.Shake.Internal.Core.Run import Development.Shake.FilePath import 
Development.Shake.Internal.FilePattern import Development.Shake.Internal.Options import Development.Shake.Internal.Rules.File import Development.Shake.Internal.Derived --------------------------------------------------------------------- -- ACTUAL EXECUTION -- | /Deprecated:/ Use 'AddPath'. This function will be removed in a future version. -- -- Add a prefix and suffix to the @$PATH@ environment variable. For example: -- -- @ -- opt <- 'addPath' [\"\/usr\/special\"] [] -- 'cmd' opt \"userbinary --version\" -- @ -- -- Would prepend @\/usr\/special@ to the current @$PATH@, and the command would pick -- @\/usr\/special\/userbinary@, if it exists. To add other variables see 'addEnv'. addPath :: MonadIO m => [String] -> [String] -> m CmdOption addPath pre post = do args <- liftIO getEnvironment let (path,other) = partition ((== "PATH") . (if isWindows then upper else id) . fst) args return $ Env $ [("PATH",intercalate [searchPathSeparator] $ pre ++ post) | null path] ++ [(a,intercalate [searchPathSeparator] $ pre ++ [b | b /= ""] ++ post) | (a,b) <- path] ++ other -- | /Deprecated:/ Use 'AddEnv'. This function will be removed in a future version. -- -- Add a single variable to the environment. For example: -- -- @ -- opt <- 'addEnv' [(\"CFLAGS\",\"-O2\")] -- 'cmd' opt \"gcc -c main.c\" -- @ -- -- Would add the environment variable @$CFLAGS@ with value @-O2@. If the variable @$CFLAGS@ -- was already defined it would be overwritten. If you wish to modify @$PATH@ see 'addPath'. 
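The overwrite behaviour described above comes down to a simple merge: the extra variables are placed first, and any old binding with the same name is filtered out. The same logic, extracted into a self-contained pure function:

```haskell
-- Merge extra variables over an existing environment: extras win,
-- shadowed old bindings are dropped (mirroring the addEnv logic below).
mergeEnv :: [(String, String)] -> [(String, String)] -> [(String, String)]
mergeEnv extra env = extra ++ filter (\(a, _) -> a `notElem` map fst extra) env

main :: IO ()
main = print (mergeEnv [("CFLAGS","-O2")] [("PATH","/usr/bin"),("CFLAGS","-g")])
-- prints [("CFLAGS","-O2"),("PATH","/usr/bin")]
```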
addEnv :: MonadIO m => [(String, String)] -> m CmdOption addEnv extra = do args <- liftIO getEnvironment return $ Env $ extra ++ filter (\(a,_) -> a `notElem` map fst extra) args data Str = Str String | BS BS.ByteString | LBS LBS.ByteString | Unit deriving Eq data Result = ResultStdout Str | ResultStderr Str | ResultStdouterr Str | ResultCode ExitCode | ResultTime Double | ResultLine String | ResultProcess PID deriving Eq data PID = PID0 | PID ProcessHandle instance Eq PID where _ == _ = True --------------------------------------------------------------------- -- ACTION EXPLICIT OPERATION -- | Given explicit operations, apply the advanced ones, like skip/trace/track/autodep commandExplicit :: String -> [CmdOption] -> [Result] -> String -> [String] -> Action [Result] commandExplicit funcName oopts results exe args = do ShakeOptions {shakeCommandOptions,shakeRunCommands ,shakeLint,shakeLintInside,shakeLintIgnore} <- getShakeOptions let fopts = shakeCommandOptions ++ oopts let useShell = Shell `elem` fopts let useLint = shakeLint == Just LintFSATrace let useAutoDeps = AutoDeps `elem` fopts let opts = filter (/= Shell) fopts let skipper act = if null results && not shakeRunCommands then return [] else act let verboser act = do let cwd = listToMaybe $ reverse [x | Cwd x <- opts] putLoud $ maybe "" (\x -> "cd " ++ x ++ "; ") cwd ++ if useShell then unwords $ exe : args else showCommandForUser2 exe args verb <- getVerbosity -- run quietly to suppress the tracer (don't want to print twice) (if verb >= Loud then quietly else id) act let tracer = case reverse [x | Traced x <- opts] of "":_ -> liftIO msg:_ -> traced msg _ | useShell -> traced $ takeFileName $ fst $ word1 exe [] -> traced $ takeFileName exe let tracker act | useLint = fsatrace act | useAutoDeps = autodeps act | useShell = shelled act | otherwise = act exe args shelled = runShell (unwords $ exe : args) ignore = map (?==) shakeLintIgnore ham cwd xs = [makeRelative cwd x | x <- map toStandard xs , any 
(`isPrefixOf` x) shakeLintInside , not $ any ($ x) ignore] fsaCmd act opts file | isMac = fsaCmdMac act opts file | useShell = runShell (unwords $ exe : args) $ \exe args -> act "fsatrace" $ opts : file : "--" : exe : args | otherwise = act "fsatrace" $ opts : file : "--" : exe : args fsaCmdMac act opts file = do let fakeExe e = liftIO $ do me <- findExecutable e case me of Just re -> do let isSystem = any (`isPrefixOf` re) [ "/bin" , "/usr" , "/sbin" ] if isSystem then do tmpdir <- getTemporaryDirectory let fake = tmpdir ++ "fsatrace-fakes" ++ re unlessM (doesFileExist fake) $ do createDirectoryRecursive $ takeDirectory fake copyFile re fake return fake else return re Nothing -> return e fexe <- fakeExe exe if useShell then do fsh <- fakeExe "/bin/sh" act "fsatrace" $ opts : file : "--" : fsh : "-c" : [unwords $ fexe : args] else act "fsatrace" $ opts : file : "--" : fexe : args fsatrace act = withTempFile $ \file -> do res <- fsaCmd act "rwm" file xs <- liftIO $ parseFSAT <$> readFileUTF8' file cwd <- liftIO getCurrentDirectory let reader (FSATRead x) = Just x; reader _ = Nothing writer (FSATWrite x) = Just x; writer (FSATMove x _) = Just x; writer _ = Nothing existing f = liftIO . filterM doesFileExist . nubOrd . 
mapMaybe f rs <- existing reader xs ws <- existing writer xs let reads = ham cwd rs writes = ham cwd ws when useAutoDeps $ unsafeAllowApply $ needed reads trackRead reads trackWrite writes return res autodeps act = withTempFile $ \file -> do res <- fsaCmd act "r" file pxs <- liftIO $ parseFSAT <$> readFileUTF8' file xs <- liftIO $ filterM doesFileExist [x | FSATRead x <- pxs] cwd <- liftIO getCurrentDirectory unsafeAllowApply $ need $ ham cwd xs return res skipper $ tracker $ \exe args -> verboser $ tracer $ commandExplicitIO funcName opts results exe args -- | Given a shell command, call the continuation with the sanitised exec-style arguments runShell :: String -> (String -> [String] -> Action a) -> Action a runShell x act | not isWindows = act "/bin/sh" ["-c",x] -- do exactly what Haskell does runShell x act = withTempDir $ \dir -> do let file = dir "s.bat" writeFile' file x act "cmd.exe" ["/d/q/c",file] -- | Parse the FSATrace structure data FSAT = FSATWrite FilePath | FSATRead FilePath | FSATDelete FilePath | FSATMove FilePath FilePath -- | Parse the 'FSAT' entries, ignoring anything you don't understand. parseFSAT :: String -> [FSAT] parseFSAT = mapMaybe f . 
lines where f ('w':'|':xs) = Just $ FSATWrite xs f ('r':'|':xs) = Just $ FSATRead xs f ('d':'|':xs) = Just $ FSATDelete xs f ('m':'|':xs) | (xs,'|':ys) <- break (== '|') xs = Just $ FSATMove xs ys f _ = Nothing --------------------------------------------------------------------- -- IO EXPLICIT OPERATION -- | Given a very explicit set of CmdOption, translate them to a General.Process structure commandExplicitIO :: String -> [CmdOption] -> [Result] -> String -> [String] -> IO [Result] commandExplicitIO funcName opts results exe args = do let (grabStdout, grabStderr) = both or $ unzip $ for results $ \r -> case r of ResultStdout{} -> (True, False) ResultStderr{} -> (False, True) ResultStdouterr{} -> (True, True) _ -> (False, False) optEnv <- resolveEnv opts let optCwd = let x = last $ "" : [x | Cwd x <- opts] in if x == "" then Nothing else Just x let optStdin = flip mapMaybe opts $ \x -> case x of Stdin x -> Just $ SrcString x StdinBS x -> Just $ SrcBytes x FileStdin x -> Just $ SrcFile x _ -> Nothing let optShell = Shell `elem` opts let optBinary = BinaryPipes `elem` opts let optAsync = ResultProcess PID0 `elem` results let optTimeout = listToMaybe $ reverse [x | Timeout x <- opts] let optWithStdout = last $ False : [x | WithStdout x <- opts] let optWithStderr = last $ True : [x | WithStderr x <- opts] let optFileStdout = [x | FileStdout x <- opts] let optFileStderr = [x | FileStderr x <- opts] let optEchoStdout = last $ (not grabStdout && null optFileStdout) : [x | EchoStdout x <- opts] let optEchoStderr = last $ (not grabStderr && null optFileStderr) : [x | EchoStderr x <- opts] let cmdline = showCommandForUser2 exe args let bufLBS f = do (a,b) <- buf $ LBS LBS.empty; return (a, (\(LBS x) -> f x) <$> b) buf Str{} | optBinary = bufLBS (Str . LBS.unpack) buf Str{} = do x <- newBuffer; return ([DestString x | not optAsync], Str . concat <$> readBuffer x) buf LBS{} = do x <- newBuffer; return ([DestBytes x | not optAsync], LBS . 
LBS.fromChunks <$> readBuffer x) buf BS {} = bufLBS (BS . BS.concat . LBS.toChunks) buf Unit = return ([], return Unit) (dStdout, dStderr, resultBuild) :: ([[Destination]], [[Destination]], [Double -> ProcessHandle -> ExitCode -> IO Result]) <- fmap unzip3 $ forM results $ \r -> case r of ResultCode _ -> return ([], [], \_ _ ex -> return $ ResultCode ex) ResultTime _ -> return ([], [], \dur _ _ -> return $ ResultTime dur) ResultLine _ -> return ([], [], \_ _ _ -> return $ ResultLine cmdline) ResultProcess _ -> return ([], [], \_ pid _ -> return $ ResultProcess $ PID pid) ResultStdout s -> do (a,b) <- buf s; return (a , [], \_ _ _ -> fmap ResultStdout b) ResultStderr s -> do (a,b) <- buf s; return ([], a , \_ _ _ -> fmap ResultStderr b) ResultStdouterr s -> do (a,b) <- buf s; return (a , a , \_ _ _ -> fmap ResultStdouterr b) exceptionBuffer <- newBuffer po <- resolvePath ProcessOpts {poCommand = if optShell then ShellCommand $ unwords $ exe:args else RawCommand exe args ,poCwd = optCwd, poEnv = optEnv, poTimeout = optTimeout ,poStdin = [SrcBytes LBS.empty | optBinary && not (null optStdin)] ++ optStdin ,poStdout = [DestEcho | optEchoStdout] ++ map DestFile optFileStdout ++ [DestString exceptionBuffer | optWithStdout && not optAsync] ++ concat dStdout ,poStderr = [DestEcho | optEchoStderr] ++ map DestFile optFileStderr ++ [DestString exceptionBuffer | optWithStderr && not optAsync] ++ concat dStderr ,poAsync = optAsync } (dur,(pid,exit)) <- duration $ process po if exit == ExitSuccess || ResultCode ExitSuccess `elem` results then mapM (\f -> f dur pid exit) resultBuild else do exceptionBuffer <- readBuffer exceptionBuffer let captured = ["Stderr" | optWithStderr] ++ ["Stdout" | optWithStdout] cwd <- case optCwd of Nothing -> return "" Just v -> do v <- canonicalizePath v `catchIO` const (return v) return $ "Current directory: " ++ v ++ "\n" fail $ "Development.Shake." 
++ funcName ++ ", system command failed\n" ++ "Command: " ++ cmdline ++ "\n" ++ cwd ++ "Exit code: " ++ show (case exit of ExitFailure i -> i; _ -> 0) ++ "\n" ++ if null captured then "Stderr not captured because WithStderr False was used\n" else if null exceptionBuffer then intercalate " and " captured ++ " " ++ (if length captured == 1 then "was" else "were") ++ " empty" else intercalate " and " captured ++ ":\n" ++ unlines (dropWhile null $ lines $ concat exceptionBuffer) -- | Apply all environment operations, to produce a new environment to use. resolveEnv :: [CmdOption] -> IO (Maybe [(String, String)]) resolveEnv opts | null env, null addEnv, null addPath, null remEnv = return Nothing | otherwise = Just . unique . tweakPath . (++ addEnv) . filter (flip notElem remEnv . fst) <$> if null env then getEnvironment else return (concat env) where env = [x | Env x <- opts] addEnv = [(x,y) | AddEnv x y <- opts] remEnv = [x | RemEnv x <- opts] addPath = [(x,y) | AddPath x y <- opts] newPath mid = intercalate [searchPathSeparator] $ concat (reverse $ map fst addPath) ++ [mid | mid /= ""] ++ concatMap snd addPath isPath x = (if isWindows then upper else id) x == "PATH" tweakPath xs | not $ any (isPath . fst) xs = ("PATH", newPath "") : xs | otherwise = map (\(a,b) -> (a, if isPath a then newPath b else b)) xs unique = reverse . nubOrdOn (if isWindows then upper . fst else fst) . reverse -- | If the user specifies a custom $PATH, and not Shell, then try and resolve their exe ourselves. -- Tricky, because on Windows it doesn't look in the $PATH first. resolvePath :: ProcessOpts -> IO ProcessOpts resolvePath po | Just e <- poEnv po , Just (_, path) <- find ((==) "PATH" . (if isWindows then upper else id) . 
fst) e , RawCommand prog args <- poCommand po = do let progExe = if prog == prog -<.> exe then prog else prog <.> exe -- use unsafeInterleaveIO to allow laziness to skip the queries we don't use pathOld <- unsafeInterleaveIO $ fromMaybe "" <$> lookupEnv "PATH" old <- unsafeInterleaveIO $ findExecutable prog new <- unsafeInterleaveIO $ findExecutableWith (splitSearchPath path) progExe old2 <- unsafeInterleaveIO $ findExecutableWith (splitSearchPath pathOld) progExe switch <- return $ case () of _ | path == pathOld -> False -- The state I can see hasn't changed | Nothing <- new -> False -- I have nothing to offer | Nothing <- old -> True -- I failed last time, so this must be an improvement | Just old <- old, Just new <- new, equalFilePath old new -> False -- no different | Just old <- old, Just old2 <- old2, equalFilePath old old2 -> True -- I could predict last time | otherwise -> False return $ case new of Just new | switch -> po{poCommand = RawCommand new args} _ -> po resolvePath po = return po -- | Given a list of directories, and a file name, return the complete path if you can find it. -- Like findExecutable, but with a custom PATH. findExecutableWith :: [FilePath] -> String -> IO (Maybe FilePath) findExecutableWith path x = flip firstJustM (map ( x) path) $ \s -> ifM (doesFileExist s) (return $ Just s) (return Nothing) --------------------------------------------------------------------- -- FIXED ARGUMENT WRAPPER -- | Collect the @stdout@ of the process. -- If used, the @stdout@ will not be echoed to the terminal, unless you include 'EchoStdout'. -- The value type may be either 'String', or either lazy or strict 'ByteString'. newtype Stdout a = Stdout {fromStdout :: a} -- | Collect the @stderr@ of the process. -- If used, the @stderr@ will not be echoed to the terminal, unless you include 'EchoStderr'. -- The value type may be either 'String', or either lazy or strict 'ByteString'. 
newtype Stderr a = Stderr {fromStderr :: a} -- | Collect the @stdout@ and @stderr@ of the process. -- If used, the @stderr@ and @stdout@ will not be echoed to the terminal, unless you include 'EchoStdout' and 'EchoStderr'. -- The value type may be either 'String', or either lazy or strict 'ByteString'. newtype Stdouterr a = Stdouterr {fromStdouterr :: a} -- | Collect the 'ExitCode' of the process. -- If you do not collect the exit code, any 'ExitFailure' will cause an exception. newtype Exit = Exit {fromExit :: ExitCode} -- | Collect the 'ProcessHandle' of the process. -- If you do collect the process handle, the command will run asynchronously and the call to 'cmd' \/ 'command' -- will return as soon as the process is spawned. Any 'Stdout' \/ 'Stderr' captures will return empty strings. newtype Process = Process {fromProcess :: ProcessHandle} -- | Collect the time taken to execute the process. Can be used in conjunction with 'CmdLine' to -- write helper functions that print out the time of a result. -- -- @ -- timer :: ('CmdResult' r, MonadIO m) => (forall r . 'CmdResult' r => m r) -> m r -- timer act = do -- ('CmdTime' t, 'CmdLine' x, r) <- act -- liftIO $ putStrLn $ \"Command \" ++ x ++ \" took \" ++ show t ++ \" seconds\" -- return r -- -- run :: IO () -- run = timer $ 'cmd' \"ghc --version\" -- @ newtype CmdTime = CmdTime {fromCmdTime :: Double} -- | Collect the command line used for the process. This command line will be approximate - -- suitable for user diagnostics, but not for direct execution. newtype CmdLine = CmdLine {fromCmdLine :: String} -- | The allowable 'String'-like values that can be captured.
class CmdString a where cmdString :: (Str, Str -> a) instance CmdString () where cmdString = (Unit, \Unit -> ()) instance CmdString String where cmdString = (Str "", \(Str x) -> x) instance CmdString BS.ByteString where cmdString = (BS BS.empty, \(BS x) -> x) instance CmdString LBS.ByteString where cmdString = (LBS LBS.empty, \(LBS x) -> x) #if __GLASGOW_HASKELL__ >= 710 class Unit a instance {-# OVERLAPPING #-} Unit b => Unit (a -> b) instance {-# OVERLAPPABLE #-} a ~ () => Unit (m a) #else class Unit a instance Unit b => Unit (a -> b) instance a ~ () => Unit (m a) #endif -- | A class for specifying what results you want to collect from a process. -- Values are formed of 'Stdout', 'Stderr', 'Exit' and tuples of those. class CmdResult a where -- Return a list of results (with the right type but dummy data) -- and a function to transform a populated set of results into a value cmdResult :: ([Result], [Result] -> a) instance CmdResult Exit where cmdResult = ([ResultCode ExitSuccess], \[ResultCode x] -> Exit x) instance CmdResult ExitCode where cmdResult = ([ResultCode ExitSuccess], \[ResultCode x] -> x) instance CmdResult Process where cmdResult = ([ResultProcess PID0], \[ResultProcess (PID x)] -> Process x) instance CmdResult ProcessHandle where cmdResult = ([ResultProcess PID0], \[ResultProcess (PID x)] -> x) instance CmdResult CmdLine where cmdResult = ([ResultLine ""], \[ResultLine x] -> CmdLine x) instance CmdResult CmdTime where cmdResult = ([ResultTime 0], \[ResultTime x] -> CmdTime x) instance CmdString a => CmdResult (Stdout a) where cmdResult = let (a,b) = cmdString in ([ResultStdout a], \[ResultStdout x] -> Stdout $ b x) instance CmdString a => CmdResult (Stderr a) where cmdResult = let (a,b) = cmdString in ([ResultStderr a], \[ResultStderr x] -> Stderr $ b x) instance CmdString a => CmdResult (Stdouterr a) where cmdResult = let (a,b) = cmdString in ([ResultStdouterr a], \[ResultStdouterr x] -> Stdouterr $ b x) instance CmdResult () where cmdResult = ([], 
\[] -> ()) instance (CmdResult x1, CmdResult x2) => CmdResult (x1,x2) where cmdResult = (a1++a2, \rs -> let (r1,r2) = splitAt (length a1) rs in (b1 r1, b2 r2)) where (a1,b1) = cmdResult (a2,b2) = cmdResult cmdResultWith :: forall b c. CmdResult b => (b -> c) -> ([Result], [Result] -> c) cmdResultWith f = second (f .) cmdResult instance (CmdResult x1, CmdResult x2, CmdResult x3) => CmdResult (x1,x2,x3) where cmdResult = cmdResultWith $ \(a,(b,c)) -> (a,b,c) instance (CmdResult x1, CmdResult x2, CmdResult x3, CmdResult x4) => CmdResult (x1,x2,x3,x4) where cmdResult = cmdResultWith $ \(a,(b,c,d)) -> (a,b,c,d) instance (CmdResult x1, CmdResult x2, CmdResult x3, CmdResult x4, CmdResult x5) => CmdResult (x1,x2,x3,x4,x5) where cmdResult = cmdResultWith $ \(a,(b,c,d,e)) -> (a,b,c,d,e) -- | Execute a system command. Before running 'command' make sure you 'Development.Shake.need' any files -- that are used by the command. -- -- This function takes a list of options (often just @[]@, see 'CmdOption' for the available -- options), the name of the executable (either a full name, or a program on the @$PATH@) and -- a list of arguments. The result is often @()@, but can be a tuple containing any of 'Stdout', -- 'Stderr' and 'Exit'. Some examples: -- -- @ -- 'command_' [] \"gcc\" [\"-c\",\"myfile.c\"] -- compile a file, throwing an exception on failure -- 'Exit' c <- 'command' [] \"gcc\" [\"-c\",myfile] -- run a command, recording the exit code -- ('Exit' c, 'Stderr' err) <- 'command' [] \"gcc\" [\"-c\",\"myfile.c\"] -- run a command, recording the exit code and error output -- 'Stdout' out <- 'command' [] \"gcc\" [\"-MM\",\"myfile.c\"] -- run a command, recording the output -- 'command_' ['Cwd' \"generated\"] \"gcc\" [\"-c\",myfile] -- run a command in a directory -- @ -- -- Unless you retrieve the 'ExitCode' using 'Exit', any 'ExitFailure' will throw an error, including -- the 'Stderr' in the exception message.
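--
-- For example, to handle a failing command yourself rather than raising an exception
-- (a sketch - the @gcc@ invocation is illustrative only):
--
-- @
-- ('Exit' code, 'Stderr' err) <- 'command' [] \"gcc\" [\"-c\",\"myfile.c\"]
-- when (code /= ExitSuccess) $
--     liftIO $ putStrLn $ \"gcc failed: \" ++ err
-- @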
If you capture the 'Stdout' or 'Stderr', that stream will not be echoed to the console, -- unless you use the option 'EchoStdout' or 'EchoStderr'. -- -- If you use 'command' inside a @do@ block and do not use the result, you may get a compile-time error about being -- unable to deduce 'CmdResult'. To avoid this error, use 'command_'. -- -- By default the @stderr@ stream will be captured for use in error messages, and also echoed. To only echo, -- pass @'WithStderr' 'False'@, which causes no streams to be captured by Shake, and certain programs (e.g. @gcc@) -- to detect they are running in a terminal. command :: CmdResult r => [CmdOption] -> String -> [String] -> Action r command opts x xs = b <$> commandExplicit "command" opts a x xs where (a,b) = cmdResult -- | A version of 'command' where you do not require any results, used to avoid errors about being unable -- to deduce 'CmdResult'. command_ :: [CmdOption] -> String -> [String] -> Action () command_ opts x xs = void $ commandExplicit "command_" opts [] x xs --------------------------------------------------------------------- -- VARIABLE ARGUMENT WRAPPER -- | A type annotation, equivalent to the first argument, but in variable argument contexts, -- gives a clue as to what return type is expected (not actually enforced). type a :-> t = a -- | Execute a system command. Before running 'cmd' make sure you 'Development.Shake.need' any files -- that are used by the command. -- -- * @String@ arguments are treated as whitespace separated arguments. -- -- * @[String]@ arguments are treated as literal arguments. -- -- * 'CmdOption' arguments are used as options.
-- -- As some examples, here are some calls, and the resulting command string: -- -- @ -- 'cmd_' \"git log --pretty=\" \"oneline\" -- git log --pretty= oneline -- 'cmd_' \"git log --pretty=\" [\"oneline\"] -- git log --pretty= oneline -- 'cmd_' \"git log\" (\"--pretty=\" ++ \"oneline\") -- git log --pretty=oneline -- 'cmd_' \"git log\" (\"--pretty=\" ++ \"one line\") -- git log --pretty=one line -- 'cmd_' \"git log\" [\"--pretty=\" ++ \"one line\"] -- git log "--pretty=one line" -- @ -- -- For more examples, including return values, see this translation of the examples given for the 'command' function: -- -- @ -- 'cmd_' \"gcc -c myfile.c\" -- compile a file, throwing an exception on failure -- 'Exit' c <- 'cmd' \"gcc -c\" [myfile] -- run a command, recording the exit code -- ('Exit' c, 'Stderr' err) <- 'cmd' \"gcc -c myfile.c\" -- run a command, recording the exit code and error output -- 'Stdout' out <- 'cmd' \"gcc -MM myfile.c\" -- run a command, recording the output -- 'cmd' ('Cwd' \"generated\") \"gcc -c\" [myfile] :: 'Action' () -- run a command in a directory -- @ -- -- When passing file arguments we use @[myfile]@ so that if the @myfile@ variable contains spaces they are properly escaped. -- -- If you use 'cmd' inside a @do@ block and do not use the result, you may get a compile-time error about being -- unable to deduce 'CmdResult'. To avoid this error, use 'cmd_'. -- -- The 'cmd' function can also be run in the 'IO' monad, but then 'Traced' is ignored and command lines are not echoed. -- As an example: -- -- @ -- 'cmd' ('Cwd' \"generated\") 'Shell' \"gcc -c myfile.c\" :: IO () -- @ cmd :: CmdArguments args => args :-> Action r cmd = cmdArguments mempty -- | See 'cmd'. Same as 'cmd' except with a unit result. -- 'cmd' is to 'cmd_' as 'command' is to 'command_'. cmd_ :: (CmdArguments args, Unit args) => args :-> Action () cmd_ = cmd -- | The arguments to 'cmd' - see 'cmd' for examples and semantics.
newtype CmdArgument = CmdArgument [Either CmdOption String] deriving (Eq, Semigroup, Monoid, Show) -- | The arguments to 'cmd' - see 'cmd' for examples and semantics. class CmdArguments t where -- | Arguments to cmd cmdArguments :: CmdArgument -> t instance (IsCmdArgument a, CmdArguments r) => CmdArguments (a -> r) where cmdArguments xs x = cmdArguments $ xs `mappend` toCmdArgument x instance CmdResult r => CmdArguments (Action r) where cmdArguments (CmdArgument x) = case partitionEithers x of (opts, x:xs) -> let (a,b) = cmdResult in b <$> commandExplicit "cmd" opts a x xs _ -> error "Error, no executable or arguments given to Development.Shake.cmd" instance CmdResult r => CmdArguments (IO r) where cmdArguments (CmdArgument x) = case partitionEithers x of (opts, x:xs) -> let (a,b) = cmdResult in b <$> commandExplicitIO "cmd" opts a x xs _ -> error "Error, no executable or arguments given to Development.Shake.cmd" instance CmdArguments CmdArgument where cmdArguments = id -- | Class to convert an a to a CmdArgument class IsCmdArgument a where -- | Conversion to a CmdArgument toCmdArgument :: a -> CmdArgument instance IsCmdArgument String where toCmdArgument = CmdArgument . map Right . words instance IsCmdArgument [String] where toCmdArgument = CmdArgument . map Right instance IsCmdArgument CmdOption where toCmdArgument = CmdArgument . return . Left instance IsCmdArgument [CmdOption] where toCmdArgument = CmdArgument . 
map Left instance IsCmdArgument a => IsCmdArgument (Maybe a) where toCmdArgument = maybe mempty toCmdArgument --------------------------------------------------------------------- -- UTILITIES -- A better version of showCommandForUser, which doesn't escape so much on Windows showCommandForUser2 :: FilePath -> [String] -> String showCommandForUser2 cmd args = unwords $ map (\x -> if safe x then x else showCommandForUser x []) $ cmd : args where safe xs = not (null xs) && not (any bad xs) bad x = isSpace x || (x == '\\' && not isWindows) || x `elem` "\"\'" shake-0.16.4/src/Development/Shake/Classes.hs0000644000000000000000000000105713261223301017114 0ustar0000000000000000 -- | This module reexports the six necessary type classes that every 'Rule' type must support. -- You can use this module to define new rules without depending on the @binary@, @deepseq@ and @hashable@ packages. module Development.Shake.Classes( Show(..), Typeable(..), Eq(..), Hashable(..), Binary(..), NFData(..) ) where -- I would probably reexport this module by default in Development.Shake, -- but Binary defines 'get', which clashes with the State monad. import Data.Hashable import Data.Typeable import Data.Binary import Control.DeepSeq shake-0.16.4/src/Development/Shake/Internal/0000755000000000000000000000000013261223301016734 5ustar0000000000000000shake-0.16.4/src/Development/Shake/Internal/Value.hs0000644000000000000000000001075713261223301020356 0ustar0000000000000000{-# LANGUAGE ExistentialQuantification, RecordWildCards, ScopedTypeVariables #-} {-# LANGUAGE ConstraintKinds, GeneralizedNewtypeDeriving #-} -- | This module implements the Key/Value types, to abstract over heterogeneous data types.
module Development.Shake.Internal.Value( QTypeRep(..), Value, newValue, fromValue, Key, newKey, fromKey, typeKey, ShakeValue ) where import Development.Shake.Classes import Development.Shake.Internal.Errors import Data.Typeable.Extra import Numeric import Data.Bits import Unsafe.Coerce -- | Like TypeRep, but the Show includes enough information to be unique -- so I can rely on @a == b === show a == show b@. newtype QTypeRep = QTypeRep {fromQTypeRep :: TypeRep} deriving (Eq,Hashable) instance NFData QTypeRep where -- Incorrect, but TypeRep doesn't have an NFData until GHC 7.10 -- See https://github.com/haskell/deepseq/issues/37 rnf (QTypeRep x) = x `seq` () instance Show QTypeRep where show (QTypeRep x) = show x ++ " {" ++ showHex (abs $ hashWithSalt 0 x) "" ++ "}" -- | Define an alias for the six type classes required for things involved in Shake rules. -- Using this alias requires the @ConstraintKinds@ extension. -- -- To define your own values meeting the necessary constraints it is convenient to use the extensions -- @GeneralizedNewtypeDeriving@ and @DeriveDataTypeable@ to write: -- -- > newtype MyType = MyType (String, Bool) deriving (Show, Typeable, Eq, Hashable, Binary, NFData) -- -- Shake needs these instances on keys and values. They are used for: -- -- * 'Show' is used to print out keys in errors, profiling, progress messages -- and diagnostics. -- -- * 'Typeable' is used because Shake indexes its database by the -- type of the key and value involved in the rule (overlap is not -- allowed for type classes and not allowed in Shake either). -- -- * 'Eq' and 'Hashable' are used on keys in order to build hash maps -- from keys to values. 'Eq' is used on values to test if the value -- has changed or not (this is used to support unchanging rebuilds, -- where Shake can avoid rerunning rules if it runs a dependency, -- but it turns out that no changes occurred.) 
The 'Hashable' -- instances are only used at runtime (never serialised to disk), -- so they do not have to be stable across runs. -- Hashable on values is not used, and only required for a consistent interface. -- -- * 'Binary' is used to serialize keys and values into Shake's -- build database; this lets Shake cache values across runs and -- implement unchanging rebuilds. -- -- * 'NFData' is used to avoid space and thunk leaks, especially -- when Shake is parallelized. type ShakeValue a = (Show a, Typeable a, Eq a, Hashable a, Binary a, NFData a) -- We deliberately avoid Typeable instances on Key/Value to stop them accidentally -- being used inside themselves data Key = forall a . Key {keyType :: TypeRep ,keyShow :: a -> String ,keyRnf :: a -> () ,keyEq :: a -> a -> Bool ,keyHash :: Int -> a -> Int ,keyValue :: a } data Value = forall a . Value {valueType :: TypeRep ,valueShow :: a -> String ,valueRnf :: a -> () ,valueValue :: a } newKey :: forall a . ShakeValue a => a -> Key newKey = Key (typeRep (Proxy :: Proxy a)) show rnf (==) hashWithSalt newValue :: forall a . (Typeable a, Show a, NFData a) => a -> Value newValue = Value (typeRep (Proxy :: Proxy a)) show rnf typeKey :: Key -> TypeRep typeKey Key{..} = keyType fromKey :: forall a . Typeable a => Key -> a fromKey Key{..} | keyType == resType = unsafeCoerce keyValue | otherwise = errorInternal $ "fromKey, bad cast, have " ++ show keyType ++ ", wanted " ++ show resType where resType = typeRep (Proxy :: Proxy a) fromValue :: forall a .
Typeable a => Value -> a fromValue Value{..} | valueType == resType = unsafeCoerce valueValue | otherwise = errorInternal $ "fromValue, bad cast, have " ++ show valueType ++ ", wanted " ++ show resType where resType = typeRep (Proxy :: Proxy a) instance Show Key where show Key{..} = keyShow keyValue instance Show Value where show Value{..} = valueShow valueValue instance NFData Key where rnf Key{..} = keyRnf keyValue instance NFData Value where rnf Value{..} = valueRnf valueValue instance Hashable Key where hashWithSalt salt Key{..} = hashWithSalt salt keyType `xor` keyHash salt keyValue instance Eq Key where Key{keyType=at,keyValue=a,keyEq=eq} == Key{keyType=bt,keyValue=b} | at /= bt = False | otherwise = eq a (unsafeCoerce b) shake-0.16.4/src/Development/Shake/Internal/Shake.hs0000644000000000000000000000204413261223301020323 0ustar0000000000000000 -- | The main entry point that calls all the default rules module Development.Shake.Internal.Shake(shake) where import Development.Shake.Internal.Options import General.Timing import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Rules.Directory import Development.Shake.Internal.Rules.File import Development.Shake.Internal.Rules.Files import Development.Shake.Internal.Rules.Rerun -- | Main entry point for running Shake build systems. For an example see the top of the module "Development.Shake". -- Use 'ShakeOptions' to specify how the system runs, and 'Rules' to specify what to build. The function will throw -- an exception if the build fails. -- -- To use command line flags to modify 'ShakeOptions' see 'Development.Shake.shakeArgs'. 
shake :: ShakeOptions -> Rules () -> IO () shake opts r = do addTiming "Function shake" run opts $ do r defaultRuleFile defaultRuleFiles defaultRuleDirectory defaultRuleRerun return () shake-0.16.4/src/Development/Shake/Internal/Resource.hs0000644000000000000000000001606113261223301021063 0ustar0000000000000000{-# LANGUAGE RecordWildCards, ViewPatterns #-} module Development.Shake.Internal.Resource( Resource, newResourceIO, newThrottleIO, acquireResource, releaseResource ) where import Data.Function import System.IO.Unsafe import Control.Concurrent.Extra import Control.Exception.Extra import Data.Tuple.Extra import Control.Monad import General.Bilist import Development.Shake.Internal.Core.Pool import System.Time.Extra import Data.Monoid import Prelude {-# NOINLINE resourceIds #-} resourceIds :: Var Int resourceIds = unsafePerformIO $ newVar 0 resourceId :: IO Int resourceId = modifyVar resourceIds $ \i -> let j = i + 1 in j `seq` return (j, j) -- | A type representing an external resource which the build system should respect. There -- are two ways to create 'Resource's in Shake: -- -- * 'Development.Shake.newResource' creates a finite resource, stopping too many actions running -- simultaneously. -- -- * 'Development.Shake.newThrottle' creates a throttled resource, stopping too many actions running -- over a short time period. -- -- These resources are used with 'Development.Shake.withResource' when defining rules. Typically only -- system commands (such as 'Development.Shake.cmd') should be run inside 'Development.Shake.withResource', -- not commands such as 'Development.Shake.need'. -- -- Be careful that the actions run within 'Development.Shake.withResource' do not themselves require further -- resources, or you may get a \"thread blocked indefinitely in an MVar operation\" exception. -- If an action requires multiple resources, use 'Development.Shake.withResources' to avoid deadlock. 
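--
-- As a sketch of the intended usage (the linker command and file names are illustrative only):
--
-- @
-- 'Development.Shake.shake' 'Development.Shake.shakeOptions'{'Development.Shake.shakeThreads'=4} $ do
--     disk <- 'Development.Shake.newResource' \"Disk\" 2
--     'Development.Shake.want' [\"a.exe\",\"b.exe\"]
--     \"*.exe\" 'Development.Shake.%>' \\out ->
--         'Development.Shake.withResource' disk 1 $
--             'Development.Shake.cmd' \"ld -o\" [out] [out -\<.\> \"o\"]
-- @
--
-- Here at most two link steps run at once, even though four worker threads are available.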
data Resource = Resource {resourceOrd :: Int -- ^ Key used for Eq/Ord operations. To make withResources work, we require newResourceIO < newThrottleIO ,resourceShow :: String -- ^ String used for Show ,acquireResource :: Pool -> Int -> IO () -> IO () -- ^ Acquire the resource and call the function. ,releaseResource :: Pool -> Int -> IO () -- ^ You should only ever releaseResource that you obtained with acquireResource. } instance Show Resource where show = resourceShow instance Eq Resource where (==) = (==) `on` resourceOrd instance Ord Resource where compare = compare `on` resourceOrd --------------------------------------------------------------------- -- FINITE RESOURCES data Finite = Finite {finiteAvailable :: !Int -- ^ number of currently available resources ,finiteWaiting :: Bilist (Int, IO ()) -- ^ queue of people with how much they want and the action when it is allocated to them } -- | A version of 'Development.Shake.newResource' that runs in IO, and can be called before calling 'Development.Shake.shake'. -- Most people should use 'Development.Shake.newResource' instead. 
newResourceIO :: String -> Int -> IO Resource newResourceIO name mx = do when (mx < 0) $ errorIO $ "You cannot create a resource named " ++ name ++ " with a negative quantity, you used " ++ show mx key <- resourceId var <- newVar $ Finite mx mempty return $ Resource (negate key) shw (acquire var) (release var) where shw = "Resource " ++ name acquire :: Var Finite -> Pool -> Int -> IO () -> IO () acquire var pool want continue | want < 0 = errorIO $ "You cannot acquire a negative quantity of " ++ shw ++ ", requested " ++ show want | want > mx = errorIO $ "You cannot acquire more than " ++ show mx ++ " of " ++ shw ++ ", requested " ++ show want | otherwise = join $ modifyVar var $ \x@Finite{..} -> return $ if want <= finiteAvailable then (x{finiteAvailable = finiteAvailable - want}, continue) else (x{finiteWaiting = finiteWaiting `snoc` (want, addPoolResume pool continue)}, return ()) release :: Var Finite -> Pool -> Int -> IO () release var _ i = join $ modifyVar var $ \x -> return $ f x{finiteAvailable = finiteAvailable x + i} where f (Finite i (uncons -> Just ((wi,wa),ws))) | wi <= i = second (wa >>) $ f $ Finite (i-wi) ws | otherwise = first (add (wi,wa)) $ f $ Finite i ws f (Finite i _) = (Finite i mempty, return ()) add a s = s{finiteWaiting = a `cons` finiteWaiting s} --------------------------------------------------------------------- -- THROTTLE RESOURCES -- call a function after a certain delay waiter :: Seconds -> IO () -> IO () waiter period act = void $ forkIO $ do sleep period act -- Make sure the pool cannot run try until after you have finished with it blockPool :: Pool -> IO (IO ()) blockPool pool = do bar <- newBarrier addPoolResume pool $ do cancel <- increasePool pool waitBarrier bar cancel return $ signalBarrier bar () data Throttle -- | Some number of resources are available = ThrottleAvailable !Int -- | Some users are blocked (non-empty), plus an action to call once we go back to Available | ThrottleWaiting (IO ()) (Bilist (Int, IO ())) -- | A 
version of 'Development.Shake.newThrottle' that runs in IO, and can be called before calling 'Development.Shake.shake'. -- Most people should use 'Development.Shake.newThrottle' instead. newThrottleIO :: String -> Int -> Double -> IO Resource newThrottleIO name count period = do when (count < 0) $ errorIO $ "You cannot create a throttle named " ++ name ++ " with a negative quantity, you used " ++ show count key <- resourceId var <- newVar $ ThrottleAvailable count return $ Resource key shw (acquire var) (release var) where shw = "Throttle " ++ name acquire :: Var Throttle -> Pool -> Int -> IO () -> IO () acquire var pool want continue | want < 0 = errorIO $ "You cannot acquire a negative quantity of " ++ shw ++ ", requested " ++ show want | want > count = errorIO $ "You cannot acquire more than " ++ show count ++ " of " ++ shw ++ ", requested " ++ show want | otherwise = join $ modifyVar var $ \x -> case x of ThrottleAvailable i | i >= want -> return (ThrottleAvailable $ i - want, continue) | otherwise -> do stop <- blockPool pool return (ThrottleWaiting stop $ (want - i, addPoolResume pool continue) `cons` mempty, return ()) ThrottleWaiting stop xs -> return (ThrottleWaiting stop $ xs `snoc` (want, addPoolResume pool continue), return ()) release :: Var Throttle -> Pool -> Int -> IO () release var pool n = waiter period $ join $ modifyVar var $ \x -> return $ case x of ThrottleAvailable i -> (ThrottleAvailable $ i+n, return ()) ThrottleWaiting stop xs -> f stop n xs where f stop i (uncons -> Just ((wi,wa),ws)) | i >= wi = second (wa >>) $ f stop (i-wi) ws | otherwise = (ThrottleWaiting stop $ (wi-i,wa) `cons` ws, return ()) f stop i _ = (ThrottleAvailable i, stop) shake-0.16.4/src/Development/Shake/Internal/Progress.hs0000644000000000000000000003734713261223301021112 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable, RecordWildCards, CPP, ViewPatterns, ForeignFunctionInterface #-} -- | Progress tracking module Development.Shake.Internal.Progress( Progress(..), 
progressSimple, progressDisplay, progressTitlebar, progressProgram, ProgressEntry(..), progressReplay, writeProgressReport -- INTERNAL USE ONLY ) where import Control.Applicative import Data.Tuple.Extra import Control.Exception.Extra import Control.Monad import System.Environment.Extra import System.Directory import System.Process import System.FilePath import Data.Char import Data.Data import Data.IORef import Data.List import Data.Maybe import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Lazy.Char8 as LBS import Numeric.Extra import General.Template import System.IO.Unsafe import Development.Shake.Internal.Paths import System.Time.Extra import Data.Semigroup (Semigroup (..)) import Data.Monoid hiding ((<>)) import Prelude #ifdef mingw32_HOST_OS import Foreign import Foreign.C.Types #ifdef x86_64_HOST_ARCH #define CALLCONV ccall #else #define CALLCONV stdcall #endif foreign import CALLCONV "Windows.h SetConsoleTitleA" c_setConsoleTitle :: Ptr CChar -> IO Bool #endif --------------------------------------------------------------------- -- PROGRESS TYPES - exposed to the user -- | Information about the current state of the build, obtained by either passing a callback function -- to 'Development.Shake.shakeProgress' (asynchronous output) or 'Development.Shake.getProgress' -- (synchronous output). Typically a build system will pass 'progressDisplay' to 'Development.Shake.shakeProgress', -- which will poll this value and produce status messages. data Progress = Progress {isFailure :: !(Maybe String) -- ^ Starts out 'Nothing', becomes 'Just' a target name if a rule fails. ,countSkipped :: {-# UNPACK #-} !Int -- ^ Number of rules which were required, but were already in a valid state. ,countBuilt :: {-# UNPACK #-} !Int -- ^ Number of rules which have been built in this run. ,countUnknown :: {-# UNPACK #-} !Int -- ^ Number of rules which have been built previously, but are not yet known to be required.
,countTodo :: {-# UNPACK #-} !Int -- ^ Number of rules which are currently required (ignoring dependencies that do not change), but not built. ,timeSkipped :: {-# UNPACK #-} !Double -- ^ Time spent building 'countSkipped' rules in previous runs. ,timeBuilt :: {-# UNPACK #-} !Double -- ^ Time spent building 'countBuilt' rules. ,timeUnknown :: {-# UNPACK #-} !Double -- ^ Time spent building 'countUnknown' rules in previous runs. ,timeTodo :: {-# UNPACK #-} !(Double,Int) -- ^ Time spent building 'countTodo' rules in previous runs, plus the number which have no known time (have never been built before). } deriving (Eq,Ord,Show,Read,Data,Typeable) instance Semigroup Progress where a <> b = Progress {isFailure = isFailure a `mplus` isFailure b ,countSkipped = countSkipped a + countSkipped b ,countBuilt = countBuilt a + countBuilt b ,countUnknown = countUnknown a + countUnknown b ,countTodo = countTodo a + countTodo b ,timeSkipped = timeSkipped a + timeSkipped b ,timeBuilt = timeBuilt a + timeBuilt b ,timeUnknown = timeUnknown a + timeUnknown b ,timeTodo = let (a1,a2) = timeTodo a; (b1,b2) = timeTodo b x1 = a1 + b1; x2 = a2 + b2 in x1 `seq` x2 `seq` (x1,x2) } instance Monoid Progress where mempty = Progress Nothing 0 0 0 0 0 0 0 (0,0) mappend = (<>) --------------------------------------------------------------------- -- MEALY TYPE - for writing the progress functions -- See -- | A machine that takes inputs and produces outputs newtype Mealy i a = Mealy {runMealy :: i -> (a, Mealy i a)} instance Functor (Mealy i) where fmap f (Mealy m) = Mealy $ \i -> case m i of (x, m) -> (f x, fmap f m) instance Applicative (Mealy i) where pure x = let r = Mealy (const (x, r)) in r Mealy mf <*> Mealy mx = Mealy $ \i -> case mf i of (f, mf) -> case mx i of (x, mx) -> (f x, mf <*> mx) echoMealy :: Mealy i i echoMealy = Mealy $ \i -> (i, echoMealy) scanMealy :: (a -> b -> a) -> a -> Mealy i b -> Mealy i a scanMealy f z (Mealy m) = Mealy $ \i -> case m i of (x, m) -> let z2 = f z x in (z2, 
scanMealy f z2 m) --------------------------------------------------------------------- -- MEALY UTILITIES oldMealy :: a -> Mealy i a -> Mealy i (a,a) oldMealy old = scanMealy (\(_,old) new -> (old,new)) (old,old) latch :: Mealy i (Bool, a) -> Mealy i a latch s = fromJust <$> scanMealy f Nothing s where f old (b,v) = Just $ if b then fromMaybe v old else v iff :: Mealy i Bool -> Mealy i a -> Mealy i a -> Mealy i a iff c t f = (\c t f -> if c then t else f) <$> c <*> t <*> f -- decay'd division, compute a/b, with a decay of f -- r' is the new result, r is the last result -- r' ~= a' / b' -- r' = r*b + f*(a'-a) -- ------------- -- b + f*(b'-b) -- when f == 1, r == r' -- -- both streams must only ever increase decay :: Double -> Mealy i Double -> Mealy i Double -> Mealy i Double decay f a b = scanMealy step 0 $ (,) <$> oldMealy 0 a <*> oldMealy 0 b where step r ((a,a'),(b,b')) = if isNaN r then a' / b' else ((r*b) + f*(a'-a)) / (b + f*(b'-b)) --------------------------------------------------------------------- -- MESSAGE GENERATOR formatMessage :: Double -> Double -> String formatMessage secs perc = (if isNaN secs || secs < 0 then "??s" else showMinSec $ ceiling secs) ++ " (" ++ (if isNaN perc || perc < 0 || perc > 100 then "??" 
else show $ floor perc) ++ "%)" showMinSec :: Int -> String showMinSec secs = (if m == 0 then "" else show m ++ "m" ++ ['0' | s < 10]) ++ show s ++ "s" where (m,s) = divMod secs 60 liftA2' :: Applicative m => m a -> m b -> (a -> b -> c) -> m c liftA2' a b f = liftA2 f a b -- | return (number of seconds, percentage, explanation) message :: Mealy (Double, Progress) (Double, Progress) -> Mealy (Double, Progress) (Double, Double, String) message input = liftA3 (,,) time perc debug where progress = snd <$> input secs = fst <$> input debug = (\donePerSec ruleTime (todoKnown,todoUnknown) -> "Progress: " ++ "((known=" ++ showDP 2 todoKnown ++ "s) + " ++ "(unknown=" ++ show todoUnknown ++ " * time=" ++ showDP 2 ruleTime ++ "s)) " ++ "(rate=" ++ showDP 2 donePerSec ++ "))") <$> donePerSec <*> ruleTime <*> (timeTodo <$> progress) -- Number of seconds work completed in this build run -- Ignores timeSkipped which would be more truthful, but it makes the % drop sharply -- which isn't what users want done = timeBuilt <$> progress -- Work done per second, don't divide by 0 and don't update if 'done' doesn't change donePerSec = iff ((==) 0 <$> done) (pure 1) perSecStable where perSecStable = latch $ liftA2 (,) (uncurry (==) <$> oldMealy 0 done) perSecRaw perSecRaw = decay 1.2 done secs -- Predicted build time for a rule that has never been built before -- The high decay means if a build goes in "phases" - lots of source files, then lots of compiling -- we reach a reasonable number fairly quickly, without bouncing too much ruleTime = liftA2 weightedAverage (f (decay 10) timeBuilt countBuilt) (f (liftA2 (/)) (fst . timeTodo) (\Progress{..} -> countTodo - snd timeTodo)) -- don't call decay on todo, since it goes up and down (as things get done) where weightedAverage (w1,x1) (w2,x2) | w1 == 0 && w2 == 0 = 0 | otherwise = ((w1 *. x1) + (w2 *. x2)) / intToDouble (w1+w2) where i *. 
d = if i == 0 then 0 else intToDouble i * d -- since d might be NaN f divide time count = let xs = count <$> progress in liftA2 (,) xs $ divide (time <$> progress) (intToDouble <$> xs) -- Number of seconds work remaining, ignoring multiple threads todo = f <$> progress <*> ruleTime where f Progress{..} ruleTime = fst timeTodo + (fromIntegral (snd timeTodo) * ruleTime) -- Display information time = liftA2 (/) todo donePerSec perc = iff ((==) 0 <$> done) (pure 0) $ liftA2' done todo $ \done todo -> 100 * done / (done + todo) --------------------------------------------------------------------- -- EXPOSED FUNCTIONS -- | Given a sampling interval (in seconds) and a way to display the status message, -- produce a function suitable for using as 'Development.Shake.shakeProgress'. -- This function polls the progress information every /n/ seconds, produces a status -- message and displays it using the display function. -- -- Typical status messages will take the form of @1m25s (15%)@, indicating that the build -- is predicted to complete in 1 minute 25 seconds (85 seconds total), and 15% of the necessary build time has elapsed. -- This function uses past observations to predict future behaviour, and as such, is only -- guessing. The time is likely to go up as well as down, and will be less accurate from a -- clean build (as the system has fewer past observations). -- -- The current implementation is to predict the time remaining (based on 'timeTodo') and the -- work already done ('timeBuilt'). The percentage is then calculated as @remaining / (done + remaining)@, -- while time left is calculated by scaling @remaining@ by the observed work rate in this build, -- roughly @done / time_elapsed@. progressDisplay :: Double -> (String -> IO ()) -> IO Progress -> IO () progressDisplay sample disp prog = do disp "Starting..." 
-- no useful info at this stage time <- offsetTime catchJust (\x -> if x == ThreadKilled then Just () else Nothing) (loop time $ message echoMealy) (const $ disp "Finished") where loop :: IO Double -> Mealy (Double, Progress) (Double, Double, String) -> IO () loop time mealy = do sleep sample p <- prog t <- time ((secs,perc,debug), mealy) <- return $ runMealy mealy (t, p) -- putStrLn debug disp $ formatMessage secs perc ++ maybe "" (\err -> ", Failure! " ++ err) (isFailure p) loop time mealy data ProgressEntry = ProgressEntry {idealSecs :: Double, idealPerc :: Double ,actualSecs :: Double, actualPerc :: Double } isInvalid :: ProgressEntry -> Bool isInvalid ProgressEntry{..} = isNaN actualSecs || isNaN actualPerc -- | Given a list of progress inputs, what would you have suggested (seconds, percentage) progressReplay :: [(Double, Progress)] -> [ProgressEntry] progressReplay [] = [] progressReplay ps = snd $ mapAccumL f (message echoMealy) ps where end = fst $ last ps f a (time,p) = (a2, ProgressEntry (end - time) (time * 100 / end) secs perc) where ((secs,perc,_),a2) = runMealy a (time,p) -- | Given a trace, display information about how well we did writeProgressReport :: FilePath -> [(FilePath, [(Double, Progress)])] -> IO () writeProgressReport out (map (second progressReplay) -> xs) | (bad,_):_ <- filter (any isInvalid . 
snd) xs = errorIO $ "Progress generates NaN for " ++ bad | takeExtension out == ".js" = writeFile out $ "var shake = \n" ++ generateJSON xs | takeExtension out == ".json" = writeFile out $ generateJSON xs | out == "-" = putStr $ unlines $ generateSummary xs | otherwise = LBS.writeFile out =<< generateHTML xs generateSummary :: [(FilePath, [ProgressEntry])] -> [String] generateSummary xs = flip concatMap xs $ \(file,xs) -> ["# " ++ file, f xs "Seconds" idealSecs actualSecs, f xs "Percent" idealPerc actualPerc] where levels = [100,90,80,50] f xs lbl ideal actual = lbl ++ ": " ++ intercalate ", " [show l ++ "% within " ++ show (ceiling $ maximum $ 0 : take ((length xs * l) `div` 100) diff) | l <- levels] where diff = sort [abs $ ideal x - actual x | x <- xs] generateHTML :: [(FilePath, [ProgressEntry])] -> IO LBS.ByteString generateHTML xs = do report <- readDataFileHTML "progress.html" let f name | name == "progress-data.js" = return $ LBS.pack $ "var progress =\n" ++ generateJSON xs | name == "version.js" = return $ LBS.pack $ "var version = " ++ show shakeVersionString | otherwise = readDataFileHTML name runTemplate f report generateJSON :: [(FilePath, [ProgressEntry])] -> String generateJSON = concat . jsonList . map ((++"}") . unlines . f) where f (file,ps) = ("{\"name\":" ++ show (takeFileName file) ++ ", \"values\":") : indent (jsonList $ map g ps) shw = showDP 1 g ProgressEntry{..} = jsonObject [("idealSecs",shw idealSecs),("idealPerc",shw idealPerc) ,("actualSecs",shw actualSecs),("actualPerc",shw actualPerc)] indent = map (" "++) jsonList xs = zipWith (:) ('[':repeat ',') xs ++ ["]"] jsonObject xs = "{" ++ intercalate ", " [show a ++ ":" ++ b | (a,b) <- xs] ++ "}" {-# NOINLINE xterm #-} xterm :: Bool xterm = unsafePerformIO $ -- Terminal.app uses "xterm-256color" as its env variable maybe False ("xterm" `isPrefixOf`) <$> lookupEnv "TERM" -- | Set the title of the current console window to the given text. 
If the -- environment variable @$TERM@ is set to @xterm@ this uses xterm escape sequences. -- On Windows, if not detected as an xterm, this function uses the @SetConsoleTitle@ API. progressTitlebar :: String -> IO () progressTitlebar x | xterm = BS.putStr $ BS.pack $ "\ESC]0;" ++ x ++ "\BEL" #ifdef mingw32_HOST_OS | otherwise = BS.useAsCString (BS.pack x) $ \x -> c_setConsoleTitle x >> return () #else | otherwise = return () #endif -- | Call the program @shake-progress@ if it is on the @$PATH@. The program is called with -- the following arguments: -- -- * @--title=string@ - the string passed to @progressProgram@. -- -- * @--state=Normal@, or one of @NoProgress@, @Normal@, or @Error@ to indicate -- what state the progress bar should be in. -- -- * @--value=25@ - the percent of the build that has completed, if not in @NoProgress@ state. -- -- The program will not be called consecutively with the same @--state@ and @--value@ options. -- -- Windows 7 or higher users can get taskbar progress notifications by placing the following -- program in their @$PATH@: . progressProgram :: IO (String -> IO ()) progressProgram = do exe <- findExecutable "shake-progress" case exe of Nothing -> return $ const $ return () Just exe -> do ref <- newIORef Nothing return $ \msg -> do let failure = " Failure! " `isInfixOf` msg let perc = let (a,b) = break (== '%') msg in if null b then "" else reverse $ takeWhile isDigit $ reverse a let key = (failure, perc) same <- atomicModifyIORef ref $ \old -> (Just key, old == Just key) let state | perc == "" = "NoProgress" | failure = "Error" | otherwise = "Normal" rawSystem exe $ ["--title=" ++ msg, "--state=" ++ state] ++ ["--value=" ++ perc | perc /= ""] return () -- | A simple method for displaying progress messages, suitable for using as 'Development.Shake.shakeProgress'. 
-- This function writes the current progress to the titlebar every five seconds using 'progressTitlebar', -- and calls any @shake-progress@ program on the @$PATH@ using 'progressProgram'. progressSimple :: IO Progress -> IO () progressSimple p = do program <- progressProgram progressDisplay 5 (\s -> progressTitlebar s >> program s) p shake-0.16.4/src/Development/Shake/Internal/Profile.hs {-# LANGUAGE PatternGuards, RecordWildCards #-} module Development.Shake.Internal.Profile(ProfileEntry(..), ProfileTrace(..), writeProfile) where import General.Template import Data.Tuple.Extra import Data.Function import Data.List import System.FilePath import Numeric.Extra import General.Extra import Development.Shake.Internal.Paths import System.Time.Extra import qualified Data.ByteString.Lazy.Char8 as LBS data ProfileEntry = ProfileEntry {prfName :: String, prfBuilt :: Int, prfChanged :: Int, prfDepends :: [Int], prfExecution :: Double, prfTraces :: [ProfileTrace]} data ProfileTrace = ProfileTrace {prfCommand :: String, prfStart :: Double, prfStop :: Double} prfTime ProfileTrace{..} = prfStop - prfStart -- | Generates a report given some build system profiling data. writeProfile :: FilePath -> [ProfileEntry] -> IO () writeProfile out xs | takeExtension out == ".js" = writeFile out $ "var shake = \n" ++ generateJSON xs | takeExtension out == ".json" = writeFile out $ generateJSON xs | takeExtension out == ".trace" = writeFile out $ generateTrace xs | out == "-" = putStr $ unlines $ generateSummary xs -- NOTE: On my laptop writing 1.5Mb of profile report takes 0.6s. -- This is a limitation of my laptop, not a Haskell profiling issue. -- Verified with similar "type foo > bar" commands taking similar time. | otherwise = LBS.writeFile out =<< generateHTML xs generateSummary :: [ProfileEntry] -> [String] generateSummary xs = ["* This database has tracked " ++ show (maximum (0 : map prfChanged xs) + 1) ++ " runs."
,let f = show . length in "* There are " ++ f xs ++ " rules (" ++ f ls ++ " rebuilt in the last run)." ,let f = show . sum . map (length . prfTraces) in "* Building required " ++ f xs ++ " traced commands (" ++ f ls ++ " in the last run)." ,"* The total (unparallelised) time is " ++ showDuration (sum $ map prfExecution xs) ++ " of which " ++ showDuration (sum $ map prfTime $ concatMap prfTraces xs) ++ " is traced commands." ,let f xs = if null xs then "0s" else (\(a,b) -> showDuration a ++ " (" ++ b ++ ")") $ maximumBy' (compare `on` fst) xs in "* The longest rule takes " ++ f (map (prfExecution &&& prfName) xs) ++ ", and the longest traced command takes " ++ f (map (prfTime &&& prfCommand) $ concatMap prfTraces xs) ++ "." ,let sumLast = sum $ map prfTime $ concatMap prfTraces ls maxStop = maximum $ 0 : map prfStop (concatMap prfTraces ls) in "* Last run gave an average parallelism of " ++ showDP 2 (if maxStop == 0 then 0 else sumLast / maxStop) ++ " times over " ++ showDuration maxStop ++ "." ] where ls = filter ((==) 0 . 
prfBuilt) xs generateHTML :: [ProfileEntry] -> IO LBS.ByteString generateHTML xs = do report <- readDataFileHTML "profile.html" let f name | name == "profile-data.js" = return $ LBS.pack $ "var profile =\n" ++ generateJSON xs | name == "version.js" = return $ LBS.pack $ "var version = " ++ show shakeVersionString | otherwise = readDataFileHTML name runTemplate f report generateTrace :: [ProfileEntry] -> String generateTrace xs = jsonListLines $ showEntries 0 [y{prfCommand=prfName x} | x <- xs, y <- prfTraces x] ++ showEntries 1 (concatMap prfTraces xs) where showEntries pid xs = map (showEntry pid) $ snd $ mapAccumL alloc [] $ sortBy (compare `on` prfStart) xs alloc as r | (a1,an:a2) <- break (\a -> prfStop a <= prfStart r) as = (a1++r:a2, (length a1,r)) | otherwise = (as++[r], (length as,r)) showEntry pid (tid, ProfileTrace{..}) = jsonObject [("args","{}"), ("ph",show "X"), ("cat",show "target") ,("name",show prfCommand), ("tid",show tid), ("pid",show pid) ,("ts",show $ 1000000*prfStart), ("dur",show $ 1000000*(prfStop-prfStart))] generateJSON :: [ProfileEntry] -> String generateJSON = jsonListLines . 
map showEntry where showEntry ProfileEntry{..} = jsonObject $ [("name", show prfName) ,("built", show prfBuilt) ,("changed", show prfChanged) ,("depends", show prfDepends) ,("execution", showDP 4 prfExecution)] ++ [("traces", jsonList $ map showTrace prfTraces) | not $ null prfTraces] showTrace ProfileTrace{..} = jsonObject [("command",show prfCommand), ("start",show prfStart), ("stop",show prfStop)] jsonListLines xs = "[" ++ intercalate "\n," xs ++ "\n]" jsonList xs = "[" ++ intercalate "," xs ++ "]" jsonObject xs = "{" ++ intercalate "," [show a ++ ":" ++ b | (a,b) <- xs] ++ "}" shake-0.16.4/src/Development/Shake/Internal/Paths.hs -- | The information from Paths_shake cleaned up module Development.Shake.Internal.Paths( shakeVersionString, initDataDirectory, hasManualData, copyManualData, readDataFileHTML ) where import Paths_shake import Control.Exception import Control.Monad.Extra import Data.Version import System.Directory import System.FilePath import System.Info.Extra import System.IO.Unsafe import System.Environment.Extra import General.Extra import Data.Functor import qualified Data.ByteString.Lazy as LBS import Prelude shakeVersionString :: String shakeVersionString = showVersion version -- We want getDataFileName to be relative to the current directory on program startup, -- even if we issue a change directory command. Therefore, first call caches, future ones read.
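The comment above describes a cache-on-first-use trick: a `NOINLINE` top-level CAF computed with `unsafePerformIO` runs its IO action at most once, so the directory observed on first use stays fixed even if the program later changes directory. Below is a minimal standalone sketch of that idiom (not Shake's own code; the name `startDir` is hypothetical, for illustration only):

```haskell
import Control.Exception (evaluate)
import System.Directory (getCurrentDirectory, setCurrentDirectory)
import System.IO.Unsafe (unsafePerformIO)

-- NOINLINE ensures GHC keeps this as a single shared CAF, so the
-- unsafePerformIO action runs at most once for the whole program run.
{-# NOINLINE startDir #-}
startDir :: FilePath
startDir = unsafePerformIO getCurrentDirectory

main :: IO ()
main = do
    a <- evaluate startDir    -- first force: runs getCurrentDirectory
    setCurrentDirectory "/"   -- change directory afterwards
    b <- evaluate startDir    -- cached: does not re-run the IO action
    print (a == b)            -- True: the first result was retained
```

The same reasoning explains why `initDataDirectory` below merely forces `dataDirs`: forcing it early pins the directories before any rule can change the working directory.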
{-# NOINLINE dataDirs #-} dataDirs :: [String] dataDirs = unsafePerformIO $ do datdir <- getDataDir exedir <- takeDirectory <$> getExecutablePath `catchIO` \_ -> return "" curdir <- getCurrentDirectory return $ [datdir] ++ [exedir | exedir /= ""] ++ [curdir] -- The data files may be located relative to the current directory, if so cache it in advance initDataDirectory :: IO () initDataDirectory = void $ evaluate dataDirs getDataFile :: FilePath -> IO FilePath getDataFile file = do let poss = map (</> file) dataDirs res <- filterM doesFileExist_ poss case res of [] -> fail $ unlines $ ("Could not find data file " ++ file ++ ", looked in:") : map (" " ++) poss x:_ -> return x hasDataFile :: FilePath -> IO Bool hasDataFile file = anyM (\dir -> doesFileExist_ $ dir </> file) dataDirs readDataFileHTML :: FilePath -> IO LBS.ByteString readDataFileHTML file = LBS.readFile =<< getDataFile ("html" </> file) manualFiles :: [FilePath] manualFiles = map ("docs/manual" </>) ["Build.hs","main.c","constants.c","constants.h","build" <.> if isWindows then "bat" else "sh"] hasManualData :: IO Bool hasManualData = allM hasDataFile manualFiles copyManualData :: FilePath -> IO () copyManualData dest = do createDirectoryRecursive dest forM_ manualFiles $ \file -> do src <- getDataFile file copyFile src (dest </> takeFileName file) shake-0.16.4/src/Development/Shake/Internal/Options.hs {-# LANGUAGE DeriveDataTypeable, PatternGuards #-} -- | Types exposed to the user module Development.Shake.Internal.Options( Progress(..), Verbosity(..), Rebuild(..), Lint(..), Change(..), ShakeOptions(..), shakeOptions, -- Internal stuff shakeRebuildApply, shakeAbbreviationsApply ) where import Data.Data import Data.List.Extra import Data.Tuple.Extra import Data.Maybe import Data.Dynamic import qualified Data.HashMap.Strict as Map import Development.Shake.Internal.Progress import Development.Shake.Internal.FilePattern import qualified
Data.ByteString.Char8 as BS import qualified Data.ByteString.UTF8 as UTF8 import Development.Shake.Internal.CmdOption -- | The current assumptions made by the build system, used by 'shakeRebuild'. These options -- allow the end user to specify that any rules run are either to be treated as clean, or as -- dirty, regardless of what the build system thinks. -- -- These assumptions only operate on files reached by the current 'Development.Shake.action' commands. Any -- other files in the database are left unchanged. data Rebuild = RebuildNow -- ^ Assume these files are dirty and require rebuilding. -- for benchmarking rebuild speed and for rebuilding if untracked dependencies have changed. -- This flag is safe, but may cause more rebuilding than necessary. | RebuildNormal -- ^ Useful to reset the rebuild status to how it was before, equivalent to passing no 'Rebuild' flags. | RebuildLater -- ^ /This assumption is unsafe, and may lead to incorrect build results in this run/. -- Assume these files are clean in this run, but test them normally in future runs. {- | RebuildNever -- Add to RebuildNow: Useful to undo the results of 'RebuildNever', -- ^ /This assumption is unsafe, and may lead to incorrect build results in this run, and in future runs/. -- Assume and record that these files are clean and do not require rebuilding, provided the file -- has been built before. Useful if you have modified a file in some -- inconsequential way, such as only the comments or whitespace, and wish to avoid a rebuild. -} deriving (Eq,Ord,Show,Read,Typeable,Data,Enum,Bounded) -- | Which lint checks to perform, used by 'shakeLint'. data Lint = LintBasic -- ^ The most basic form of linting. Checks that the current directory does not change and that results do not change after they -- are first written. Any calls to 'needed' will assert that they do not cause a rule to be rebuilt. | LintFSATrace -- ^ Track which files are accessed by command line programs -- using . 
deriving (Eq,Ord,Show,Read,Typeable,Data,Enum,Bounded) -- | How should you determine if a file has changed, used by 'shakeChange'. The most common values are -- 'ChangeModtime' (the default, very fast, @touch@ causes files to rebuild) and 'ChangeModtimeAndDigestInput' -- (slightly slower, @touch@ and switching @git@ branches does not cause input files to rebuild). data Change = ChangeModtime -- ^ Compare equality of modification timestamps, a file has changed if its last modified time changes. -- A @touch@ will force a rebuild. This mode is fast and usually sufficiently accurate, so is the default. | ChangeDigest -- ^ Compare equality of file contents digests, a file has changed if its digest changes. -- A @touch@ will not force a rebuild. Use this mode if modification times on your file system are unreliable. | ChangeModtimeAndDigest -- ^ A file is rebuilt if both its modification time and digest have changed. For efficiency reasons, the modification -- time is checked first, and if that has changed, the digest is checked. | ChangeModtimeAndDigestInput -- ^ Use 'ChangeModtimeAndDigest' for input\/source files and 'ChangeModtime' for output files. -- An input file is one which is a dependency but is not built by Shake as it has no -- matching rule and already exists on the file system. | ChangeModtimeOrDigest -- ^ A file is rebuilt if either its modification time or its digest has changed. A @touch@ will force a rebuild, -- but even if a files modification time is reset afterwards, changes will also cause a rebuild. deriving (Eq,Ord,Show,Read,Typeable,Data,Enum,Bounded) -- | Options to control the execution of Shake, usually specified by overriding fields in -- 'shakeOptions': -- -- @ 'shakeOptions'{'shakeThreads'=4, 'shakeReport'=[\"report.html\"]} @ -- -- The 'Data' instance for this type reports the 'shakeProgress' and 'shakeOutput' fields as having the abstract type 'Hidden', -- because 'Data' cannot be defined for functions or 'TypeRep's. 
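The Haddock above shows the record-override style for configuring Shake. A minimal, self-contained build script in that style follows; `shakeArgs`, `shakeOptions`, `want`, `%>` and `writeFile'` are all from the documented public API, while the single `hello.txt` rule is a made-up example:

```haskell
import Development.Shake

-- Run a build with 4 threads and an HTML profiling report, using the
-- shakeOptions{..} override style described in the Haddock above.
main :: IO ()
main = shakeArgs shakeOptions{shakeThreads=4, shakeReport=["report.html"]} $ do
    want ["hello.txt"]
    "hello.txt" %> \out ->
        writeFile' out "hello"
```

Any field not mentioned in the override keeps its `shakeOptions` default, so scripts only name the settings they care about.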
data ShakeOptions = ShakeOptions {shakeFiles :: FilePath -- ^ Defaults to @.shake@. The directory used for storing Shake metadata files. -- All metadata files will be named @'shakeFiles'\/.shake./file-name/@, for some @/file-name/@. -- If the 'shakeFiles' directory does not exist it will be created. ,shakeThreads :: Int -- ^ Defaults to @1@. Maximum number of rules to run in parallel, similar to @make --jobs=/N/@. -- For many build systems, a number equal to or slightly less than the number of physical processors -- works well. Use @0@ to match the detected number of processors (when @0@, 'getShakeOptions' will -- return the number of threads used). ,shakeVersion :: String -- ^ Defaults to @"1"@. The version number of your build rules. -- Change the version number to force a complete rebuild, such as when making -- significant changes to the rules that require a wipe. The version number should be -- set in the source code, and not passed on the command line. ,shakeVerbosity :: Verbosity -- ^ Defaults to 'Normal'. What level of messages should be printed out. ,shakeStaunch :: Bool -- ^ Defaults to 'False'. Operate in staunch mode, where building continues even after errors, -- similar to @make --keep-going@. ,shakeReport :: [FilePath] -- ^ Defaults to @[]@. Write a profiling report to a file, showing which rules rebuilt, -- why, and how much time they took. Useful for improving the speed of your build systems. -- If the file extension is @.json@ it will write JSON data; if @.js@ it will write Javascript; -- if @.trace@ it will write trace events (load into @about:\/\/tracing@ in Chrome); -- otherwise it will write HTML. ,shakeLint :: Maybe Lint -- ^ Defaults to 'Nothing'. Perform sanity checks during building, see 'Lint' for details. ,shakeLintInside :: [FilePath] -- ^ Directories in which the files will be tracked by the linter. 
,shakeLintIgnore :: [FilePattern] -- ^ File patterns which are ignored from linter tracking, a bit like calling 'Development.Shake.trackAllow' in every rule. ,shakeCommandOptions :: [CmdOption] -- ^ Defaults to @[]@. Additional options to be passed to all command invocations. ,shakeFlush :: Maybe Double -- ^ Defaults to @'Just' 10@. How often to flush Shake metadata files in seconds, or 'Nothing' to never flush explicitly. -- It is possible that on abnormal termination (not Haskell exceptions) any rules that completed in the last -- 'shakeFlush' seconds will be lost. ,shakeRebuild :: [(Rebuild, FilePattern)] -- ^ What to rebuild ,shakeAbbreviations :: [(String,String)] -- ^ Defaults to @[]@. A list of substrings that should be abbreviated in status messages, and their corresponding abbreviation. -- Commonly used to replace the long paths (e.g. @.make\/i586-linux-gcc\/output@) with an abbreviation (e.g. @$OUT@). ,shakeStorageLog :: Bool -- ^ Defaults to 'False'. Write a message to @'shakeFiles'\/.shake.storage.log@ whenever a storage event happens which may impact -- on the current stored progress. Examples include database version number changes, database compaction or corrupt files. ,shakeLineBuffering :: Bool -- ^ Defaults to 'True'. Change 'stdout' and 'stderr' to line buffering while running Shake. ,shakeTimings :: Bool -- ^ Defaults to 'False'. Print timing information for each stage at the end. ,shakeRunCommands :: Bool -- ^ Default to 'True'. Should you run command line actions, set to 'False' to skip actions whose output streams and exit code -- are not used. Useful for profiling the non-command portion of the build system. ,shakeChange :: Change -- ^ Default to 'ChangeModtime'. How to check if a file has changed, see 'Change' for details. ,shakeCreationCheck :: Bool -- ^ Default to 'True'. After running a rule to create a file, is it an error if the file does not exist. 
-- Provided for compatibility with @make@ and @ninja@ (which have ugly file creation semantics). -- ,shakeOutputCheck :: Bool -- -- ^ Defaults to 'True'. If a file produced by a rule changes, should you rebuild it. ,shakeLiveFiles :: [FilePath] -- ^ Defaults to @[]@. After the build system completes, write a list of all files which were /live/ in that run, -- i.e. those which Shake checked were valid or rebuilt. Produces best answers if nothing rebuilds. ,shakeVersionIgnore :: Bool -- ^ Defaults to 'False'. Ignore any differences in 'shakeVersion'. ,shakeColor :: Bool -- ^ Defaults to 'False'. Whether to colorize the output. ,shakeProgress :: IO Progress -> IO () -- ^ Defaults to no action. A function called when the build starts, allowing progress to be reported. -- The function is called on a separate thread, and that thread is killed when the build completes. -- For applications that want to display progress messages, 'progressSimple' is often sufficient, but more advanced -- users should look at the 'Progress' data type. ,shakeOutput :: Verbosity -> String -> IO () -- ^ Defaults to writing using 'putStrLn'. A function called to output messages from Shake, along with the 'Verbosity' at -- which that message should be printed. This function will be called atomically from all other 'shakeOutput' functions. -- The 'Verbosity' will always be greater than or higher than 'shakeVerbosity'. ,shakeExtra :: Map.HashMap TypeRep Dynamic -- ^ This is a map which can be used to store arbitrary extra information that a user may need when writing rules. -- The key of each entry must be the 'dynTypeRep' of the value. -- Insert values using 'addShakeExtra' and retrieve them using 'getShakeExtra'. -- The correct way to use this field is to define a hidden newtype for the key, so that conflicts cannot occur. } deriving Typeable -- | The default set of 'ShakeOptions'.
shakeOptions :: ShakeOptions shakeOptions = ShakeOptions ".shake" 1 "1" Normal False [] Nothing [] [] [] (Just 10) [] [] False True False True ChangeModtime True [] False False (const $ return ()) (const $ BS.putStrLn . UTF8.fromString) -- try and output atomically using BS Map.empty fieldsShakeOptions = ["shakeFiles", "shakeThreads", "shakeVersion", "shakeVerbosity", "shakeStaunch", "shakeReport" ,"shakeLint", "shakeLintInside", "shakeLintIgnore", "shakeCommandOptions" ,"shakeFlush", "shakeRebuild", "shakeAbbreviations", "shakeStorageLog" ,"shakeLineBuffering", "shakeTimings", "shakeRunCommands", "shakeChange", "shakeCreationCheck" ,"shakeLiveFiles","shakeVersionIgnore","shakeProgress", "shakeOutput", "shakeColor", "shakeExtra"] tyShakeOptions = mkDataType "Development.Shake.Types.ShakeOptions" [conShakeOptions] conShakeOptions = mkConstr tyShakeOptions "ShakeOptions" fieldsShakeOptions Prefix unhide x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 x12 x13 x14 x15 x16 x17 x18 x19 x20 x21 x22 y1 y2 y3 = ShakeOptions x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 x12 x13 x14 x15 x16 x17 x18 x19 x20 x21 x22 (fromHidden y1) (fromHidden y2) (fromHidden y3) instance Data ShakeOptions where gfoldl k z (ShakeOptions x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 x12 x13 x14 x15 x16 x17 x18 x19 x20 x21 x22 y1 y2 y3) = z unhide `k` x1 `k` x2 `k` x3 `k` x4 `k` x5 `k` x6 `k` x7 `k` x8 `k` x9 `k` x10 `k` x11 `k` x12 `k` x13 `k` x14 `k` x15 `k` x16 `k` x17 `k` x18 `k` x19 `k` x20 `k` x21 `k` x22 `k` Hidden y1 `k` Hidden y2 `k` Hidden y3 gunfold k z c = k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ z unhide toConstr ShakeOptions{} = conShakeOptions dataTypeOf _ = tyShakeOptions instance Show ShakeOptions where show x = "ShakeOptions {" ++ intercalate ", " inner ++ "}" where inner = zipWith (\x y -> x ++ " = " ++ y) fieldsShakeOptions $ gmapQ f x f x | Just x <- cast x = show (x :: Int) | Just x <- cast x = show (x :: FilePath) | Just x <- cast x = show (x :: 
Verbosity) | Just x <- cast x = show (x :: Change) | Just x <- cast x = show (x :: Bool) | Just x <- cast x = show (x :: [FilePath]) | Just x <- cast x = show (x :: [(Rebuild, FilePattern)]) | Just x <- cast x = show (x :: Maybe Lint) | Just x <- cast x = show (x :: Maybe Double) | Just x <- cast x = show (x :: [(String,String)]) | Just x <- cast x = show (x :: Hidden (IO Progress -> IO ())) | Just x <- cast x = show (x :: Hidden (Verbosity -> String -> IO ())) | Just x <- cast x = show (x :: Hidden (Map.HashMap TypeRep Dynamic)) | Just x <- cast x = show (x :: [CmdOption]) | otherwise = error $ "Error while showing ShakeOptions, missing alternative for " ++ show (typeOf x) -- | Internal type, copied from Hide in Uniplate newtype Hidden a = Hidden {fromHidden :: a} deriving Typeable instance Show (Hidden a) where show _ = "" instance Typeable a => Data (Hidden a) where gfoldl k z = z gunfold k z c = error "Development.Shake.Types.ShakeProgress: gunfold not implemented - data type has no constructors" toConstr _ = error "Development.Shake.Types.ShakeProgress: toConstr not implemented - data type has no constructors" dataTypeOf _ = tyHidden tyHidden = mkDataType "Development.Shake.Types.Hidden" [] -- | The verbosity data type, used by 'shakeVerbosity'. data Verbosity = Silent -- ^ Don't print any messages. | Quiet -- ^ Only print essential messages, typically errors. | Normal -- ^ Print errors and @# /command-name/ (for /file-name/)@ when running a 'Development.Shake.traced' command. | Loud -- ^ Print errors and full command lines when running a 'Development.Shake.command' or 'Development.Shake.cmd' command. | Chatty -- ^ Print errors, full command line and status messages when starting a rule. | Diagnostic -- ^ Print messages for virtually everything (mostly for debugging). 
deriving (Eq,Ord,Show,Read,Typeable,Data,Enum,Bounded) -- | Apply the 'shakeRebuild' flags to a file, determining the desired behaviour shakeRebuildApply :: ShakeOptions -> (FilePath -> Rebuild) shakeRebuildApply ShakeOptions{shakeRebuild=rs} | null rs = const RebuildNormal | otherwise = \x -> fromMaybe RebuildNormal $ firstJust (\(r,pat) -> if pat x then Just r else Nothing) rs2 where rs2 = map (second (?==)) $ reverse rs shakeAbbreviationsApply :: ShakeOptions -> String -> String shakeAbbreviationsApply ShakeOptions{shakeAbbreviations=abbrev} | null abbrev = id | otherwise = f where -- order so longer abbreviations are preferred ordAbbrev = sortOn (negate . length . fst) abbrev f [] = [] f x | (to,rest):_ <- [(to,rest) | (from,to) <- ordAbbrev, Just rest <- [stripPrefix from x]] = to ++ f rest f (x:xs) = x : f xs shake-0.16.4/src/Development/Shake/Internal/FilePattern.hs {-# LANGUAGE PatternGuards, ViewPatterns #-} module Development.Shake.Internal.FilePattern( -- * Primitive API, as exposed FilePattern, (?==), (<//>), -- * General API, used by other people. filePattern, -- * Optimisation opportunities simple, -- * Multipattern file rules compatible, extract, substitute, -- * Accelerated searching Walk(..), walk, -- * Testing only internalTest, isRelativePath, isRelativePattern ) where import Development.Shake.Internal.Errors import System.FilePath(isPathSeparator) import Data.List.Extra import Control.Applicative import Control.Monad import Data.Char import Data.Tuple.Extra import Data.Maybe import System.Info.Extra import Prelude -- | A type synonym for file patterns, containing @\/\/@ and @*@. For the syntax -- and semantics of 'FilePattern' see '?=='. -- -- Most 'normaliseEx'd 'FilePath' values are suitable as 'FilePattern' values which match -- only that specific file. On Windows @\\@ is treated as equivalent to @\/@.
-- -- You can write 'FilePattern' values as a literal string, or build them -- up using the operators 'Development.Shake.FilePath.<.>', 'Development.Shake.FilePath.</>' -- and 'Development.Shake.<//>'. However, beware that: -- -- * On Windows, use 'Development.Shake.FilePath.<.>' from "Development.Shake.FilePath" instead of from -- "System.FilePath" - otherwise @\"\/\/*\" \<.\> exe@ results in @\"\/\/*\\\\.exe\"@. -- -- * If the second argument of 'Development.Shake.FilePath.</>' has a leading path separator (namely @\/@) -- then the second argument will be returned. type FilePattern = String infixr 5 <//> -- | Join two 'FilePattern' values by inserting two @\/@ characters between them. -- Will first remove any trailing path separators on the first argument, and any leading -- separators on the second. -- -- > "dir" <//> "*" == "dir//*" (<//>) :: FilePattern -> FilePattern -> FilePattern a <//> b = dropWhileEnd isPathSeparator a ++ "//" ++ dropWhile isPathSeparator b --------------------------------------------------------------------- -- PATTERNS data Pat = Lit String -- ^ foo | Star -- ^ /*/ | Skip -- ^ // | Skip1 -- ^ //, but must be at least 1 element | Stars String [String] String -- ^ *foo*, prefix (fixed), infix floaters, suffix -- e.g. *foo*bar = Stars "" ["foo"] "bar" deriving (Show,Eq,Ord) isLit Lit{} = True; isLit _ = False fromLit (Lit x) = x data Lexeme = Str String | Slash | SlashSlash lexer :: FilePattern -> [Lexeme] lexer "" = [] lexer (x1:x2:xs) | isPathSeparator x1, isPathSeparator x2 = SlashSlash : lexer xs lexer (x1:xs) | isPathSeparator x1 = Slash : lexer xs lexer xs = Str a : lexer b where (a,b) = break isPathSeparator xs -- | Parse a FilePattern. All optimisations I can think of are invalid because they change the extracted expressions. parse :: FilePattern -> [Pat] parse = f False True .
lexer where -- str = I have ever seen a Str go past (equivalent to "can I be satisfied by no paths") -- slash = I am either at the start, or my previous character was Slash f str slash [] = [Lit "" | slash] f str slash (Str "**":xs) = Skip : f True False xs f str slash (Str x:xs) = parseLit x : f True False xs f str slash (SlashSlash:Slash:xs) | not str = Skip1 : f str True xs f str slash (SlashSlash:xs) = Skip : f str False xs f str slash (Slash:xs) = [Lit "" | not str] ++ f str True xs parseLit :: String -> Pat parseLit "*" = Star parseLit x = case split (== '*') x of [x] -> Lit x pre:xs | Just (mid,post) <- unsnoc xs -> Stars pre mid post internalTest :: IO () internalTest = do let x # y = when (parse x /= y) $ fail $ show ("FilePattern.internalTest",x,parse x,y) "" # [Lit ""] "x" # [Lit "x"] "/" # [Lit "",Lit ""] "x/" # [Lit "x",Lit ""] "/x" # [Lit "",Lit "x"] "x/y" # [Lit "x",Lit "y"] "//" # [Skip] "**" # [Skip] "//x" # [Skip, Lit "x"] "**/x" # [Skip, Lit "x"] "x//" # [Lit "x", Skip] "x/**" # [Lit "x", Skip] "x//y" # [Lit "x",Skip, Lit "y"] "x/**/y" # [Lit "x",Skip, Lit "y"] "///" # [Skip1, Lit ""] "**/**" # [Skip,Skip] "**/**/" # [Skip, Skip, Lit ""] "///x" # [Skip1, Lit "x"] "**/x" # [Skip, Lit "x"] "x///" # [Lit "x", Skip, Lit ""] "x/**/" # [Lit "x", Skip, Lit ""] "x///y" # [Lit "x",Skip, Lit "y"] "x/**/y" # [Lit "x",Skip, Lit "y"] "////" # [Skip, Skip] "**/**/**" # [Skip, Skip, Skip] "////x" # [Skip, Skip, Lit "x"] "x////" # [Lit "x", Skip, Skip] "x////y" # [Lit "x",Skip, Skip, Lit "y"] "**//x" # [Skip, Skip, Lit "x"] -- | Optimisations that may change the matched expressions optimise :: [Pat] -> [Pat] optimise (Skip:Skip:xs) = optimise $ Skip:xs optimise (Skip:Star:xs) = optimise $ Skip1:xs optimise (Star:Skip:xs) = optimise $ Skip1:xs optimise (x:xs) = x : optimise xs optimise [] =[] -- | A 'FilePattern' that will only match 'isRelativePath' values. 
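The `?==` matching semantics are documented in detail further below. As a hedged sketch, the documented examples can be checked with a tiny standalone program (it assumes the `shake` library is installed, since `?==` is re-exported from "Development.Shake"):

```haskell
import Development.Shake ((?==))

-- Each case restates an example from the (?==) Haddock below.
main :: IO ()
main = mapM_ print
    [ "test.c"  ?== "test.c"          -- True: a literal matches only itself
    , "*.c"     ?== "file.c"          -- True: * matches one path component
    , "*.c"     ?== "dir/file.c"      -- False: * never crosses a separator
    , "//*.c"   ?== "dir1/dir2/file.c" -- True: // spans any directories
    , "dir/*/*" ?== "dir/one/file.c"  -- True: exactly one level below dir
    ]
```

Printing each result rather than asserting keeps the sketch easy to adapt when experimenting with other patterns.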
isRelativePattern :: FilePattern -> Bool isRelativePattern ('*':'*':xs) | [] <- xs = True | x:xs <- xs, isPathSeparator x = True isRelativePattern _ = False -- | A non-absolute 'FilePath'. isRelativePath :: FilePath -> Bool isRelativePath (x:_) | isPathSeparator x = False isRelativePath (x:':':_) | isWindows, isAlpha x = False isRelativePath _ = True -- | Given a pattern, and a list of path components, return a list of all matches -- (for each wildcard in order, what the wildcard matched). match :: [Pat] -> [String] -> [[String]] match (Skip:xs) (y:ys) = map ("":) (match xs (y:ys)) ++ match (Skip1:xs) (y:ys) match (Skip1:xs) (y:ys) = [(y++"/"++r):rs | r:rs <- match (Skip:xs) ys] match (Skip:xs) [] = map ("":) $ match xs [] match (Star:xs) (y:ys) = map (y:) $ match xs ys match (Lit x:xs) (y:ys) = concat $ [match xs ys | x == y] ++ [match xs (y:ys) | x == "."] match (x@Stars{}:xs) (y:ys) | Just rs <- matchStars x y = map (rs ++) $ match xs ys match [] [] = [[]] match _ _ = [] matchOne :: Pat -> String -> Bool matchOne (Lit x) y = x == y matchOne x@Stars{} y = isJust $ matchStars x y matchOne Star _ = True -- Only return the first (all patterns left-most) valid star matching matchStars :: Pat -> String -> Maybe [String] matchStars (Stars pre mid post) x = do x <- stripPrefix pre x x <- if null post then Just x else stripSuffix post x stripInfixes mid x where stripInfixes [] x = Just [x] stripInfixes (m:ms) x = do (a,x) <- stripInfix m x (a:) <$> stripInfixes ms x -- | Match a 'FilePattern' against a 'FilePath', There are three special forms: -- -- * @*@ matches an entire path component, excluding any separators. -- -- * @\/\/@ matches an arbitrary number of path components, including absolute path -- prefixes. -- -- * @**@ as a path component matches an arbitrary number of path components, but not -- absolute path prefixes. -- Currently considered experimental. -- -- Some examples: -- -- * @test.c@ matches @test.c@ and nothing else. 
-- -- * @*.c@ matches all @.c@ files in the current directory, so @file.c@ matches, -- but @file.h@ and @dir\/file.c@ don't. -- -- * @\/\/*.c@ matches all @.c@ files anywhere on the filesystem, -- so @file.c@, @dir\/file.c@, @dir1\/dir2\/file.c@ and @\/path\/to\/file.c@ all match, -- but @file.h@ and @dir\/file.h@ don't. -- -- * @dir\/*\/*@ matches all files one level below @dir@, so @dir\/one\/file.c@ and -- @dir\/two\/file.h@ match, but @file.c@, @one\/dir\/file.c@, @dir\/file.h@ -- and @dir\/one\/two\/file.c@ don't. -- -- Patterns with constructs such as @foo\/..\/bar@ will never match -- normalised 'FilePath' values, so are unlikely to be correct. (?==) :: FilePattern -> FilePath -> Bool (?==) p = case optimise $ parse p of [x] | x == Skip || x == Skip1 -> if rp then isRelativePath else const True p -> let f = not . null . match p . split isPathSeparator in if rp then (\x -> isRelativePath x && f x) else f where rp = isRelativePattern p -- | Like '?==', but returns 'Nothing' if there is no match, otherwise 'Just' with the list -- of fragments matching each wildcard. For example: -- -- @ -- 'filePattern' \"**\/*.c\" \"test.txt\" == Nothing -- 'filePattern' \"**\/*.c\" \"foo.c\" == Just [\"\",\"foo\"] -- 'filePattern' \"**\/*.c\" \"bar\/baz\/foo.c\" == Just [\"bar\/baz\/\",\"foo\"] -- @ -- -- Note that the @**@ will often contain a trailing @\/@, and even on Windows any -- @\\@ separators will be replaced by @\/@. filePattern :: FilePattern -> FilePath -> Maybe [String] filePattern p = \x -> if eq x then Just $ ex x else Nothing where eq = (?==) p ex = extract p --------------------------------------------------------------------- -- MULTIPATTERN COMPATIBLE SUBSTITUTIONS specials :: FilePattern -> [Pat] specials = concatMap f . parse where f Lit{} = [] f Star = [Star] f Skip = [Skip] f Skip1 = [Skip] f (Stars _ xs _) = replicate (length xs + 1) Star -- | Is the pattern free from any * and //. simple :: FilePattern -> Bool simple = null . 
specials -- | Do they have the same * and // counts in the same order compatible :: [FilePattern] -> Bool compatible [] = True compatible (x:xs) = all ((==) (specials x) . specials) xs -- | Extract the items that match the wildcards. The pair must match with '?=='. extract :: FilePattern -> FilePath -> [String] extract p = let pat = parse p in \x -> case match pat (split isPathSeparator x) of [] | p ?== x -> errorInternal $ "extract with " ++ show p ++ " and " ++ show x | otherwise -> error $ "Pattern " ++ show p ++ " does not match " ++ x ++ ", when trying to extract the FilePattern matches" ms:_ -> ms -- | Given the result of 'extract', substitute it back in to a 'compatible' pattern. -- -- > p '?==' x ==> substitute (extract p x) p == x substitute :: [String] -> FilePattern -> FilePath substitute oms oxs = intercalate "/" $ concat $ snd $ mapAccumL f oms (parse oxs) where f ms (Lit x) = (ms, [x]) f (m:ms) Star = (ms, [m]) f (m:ms) Skip = (ms, split m) f (m:ms) Skip1 = (ms, split m) f ms (Stars pre mid post) = (ms2, [concat $ pre : zipWith (++) ms1 (mid++[post])]) where (ms1,ms2) = splitAt (length mid + 1) ms f _ _ = error $ "Substitution failed into pattern " ++ show oxs ++ " with " ++ show (length oms) ++ " matches, namely " ++ show oms split = linesBy (== '/') --------------------------------------------------------------------- -- EFFICIENT PATH WALKING -- | Given a list of files, return a list of things I can match in this directory -- plus a list of subdirectories and walks that apply to them. -- Use WalkTo when the list can be predicted in advance data Walk = Walk ([String] -> ([String],[(String,Walk)])) | WalkTo ([String],[(String,Walk)]) walk :: [FilePattern] -> (Bool, Walk) walk ps = (any (\p -> isEmpty p || not (null $ match p [""])) ps2, f ps2) where ps2 = map (filter (/= Lit ".") . optimise . parse) ps f (nubOrd -> ps) | all isLit fin, all (isLit . 
fst) nxt = WalkTo (map fromLit fin, map (fromLit *** f) nxt) | otherwise = Walk $ \xs -> (if finStar then xs else filter (\x -> any (`matchOne` x) fin) xs ,[(x, f ys) | x <- xs, let ys = concat [b | (a,b) <- nxt, matchOne a x], not $ null ys]) where finStar = Star `elem` fin fin = nubOrd $ mapMaybe final ps nxt = groupSort $ concatMap next ps next :: [Pat] -> [(Pat, [Pat])] next (Skip1:xs) = [(Star,Skip:xs)] next (Skip:xs) = (Star,Skip:xs) : next xs next (x:xs) = [(x,xs) | not $ null xs] next [] = [] final :: [Pat] -> Maybe Pat final (Skip:xs) = if isEmpty xs then Just Star else final xs final (Skip1:xs) = if isEmpty xs then Just Star else Nothing final (x:xs) = if isEmpty xs then Just x else Nothing final [] = Nothing isEmpty = all (== Skip) shake-0.16.4/src/Development/Shake/Internal/FileName.hs0000644000000000000000000000521413261223301020752 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, FlexibleInstances #-} module Development.Shake.Internal.FileName( FileName, fileNameFromString, fileNameFromByteString, fileNameToString, fileNameToByteString, filepathNormalise ) where import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.UTF8 as UTF8 import Development.Shake.Classes import qualified System.FilePath as Native import General.Binary import System.Info.Extra import Data.List --------------------------------------------------------------------- -- Data.ByteString -- Mostly because ByteString does not have an NFData instance in GHC 7.4 -- | UTF8 ByteString newtype FileName = FileName BS.ByteString deriving (Hashable, Binary, BinaryEx, Eq) instance NFData FileName where rnf (FileName x) = x `seq` () instance Show FileName where show = fileNameToString instance BinaryEx [FileName] where putEx = putEx . map (\(FileName x) -> x) getEx = map FileName . getEx fileNameToString :: FileName -> FilePath fileNameToString = UTF8.toString . 
fileNameToByteString fileNameToByteString :: FileName -> BS.ByteString fileNameToByteString (FileName x) = x fileNameFromString :: FilePath -> FileName fileNameFromString = fileNameFromByteString . UTF8.fromString fileNameFromByteString :: BS.ByteString -> FileName fileNameFromByteString = FileName . filepathNormalise --------------------------------------------------------------------- -- NORMALISATION -- | Equivalent to @toStandard . normaliseEx@ from "Development.Shake.FilePath". filepathNormalise :: BS.ByteString -> BS.ByteString filepathNormalise xs | isWindows, Just (a,xs) <- BS.uncons xs, sep a, Just (b,_) <- BS.uncons xs, sep b = '/' `BS.cons` f xs | otherwise = f xs where sep = Native.isPathSeparator f o = deslash o $ BS.concat $ (slash:) $ intersperse slash $ reverse $ (BS.empty:) $ g 0 $ reverse $ split o deslash o x | x == slash = case (pre,pos) of (True,True) -> slash (True,False) -> BS.pack "/." (False,True) -> BS.pack "./" (False,False) -> dot | otherwise = (if pre then id else BS.tail) $ (if pos then id else BS.init) x where pre = not (BS.null o) && sep (BS.head o) pos = not (BS.null o) && sep (BS.last o) g i [] = replicate i dotDot g i (x:xs) | BS.null x = g i xs g i (x:xs) | x == dotDot = g (i+1) xs g i (x:xs) | x == dot = g i xs g 0 (x:xs) = x : g 0 xs g i (x:xs) = g (i-1) xs split = BS.splitWith sep dotDot = BS.pack ".." dot = BS.singleton '.' 
slash = BS.singleton '/' shake-0.16.4/src/Development/Shake/Internal/FileInfo.hs0000644000000000000000000001371513261223301020772 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, CPP, ForeignFunctionInterface #-} module Development.Shake.Internal.FileInfo( FileInfo, fileInfoNoHash, FileSize, ModTime, FileHash, getFileHash, getFileInfo ) where import Control.Exception.Extra import Development.Shake.Classes import Development.Shake.Internal.Errors import Development.Shake.Internal.FileName import qualified Data.ByteString.Lazy as LBS import Data.Char import Data.Word import Numeric import System.IO import Foreign #if defined(PORTABLE) import System.IO.Error import System.Directory import Data.Time #if __GLASGOW_HASKELL__ < 706 import System.Time #endif #elif defined(mingw32_HOST_OS) import Control.Monad import qualified Data.ByteString.Char8 as BS import Foreign.C.Types import Foreign.C.String #else import GHC.IO.Exception import System.IO.Error import System.Posix.Files.ByteString #endif -- A piece of file information, where 0 and 1 are special (see fileInfo* functions) newtype FileInfo a = FileInfo Word32 deriving (Typeable,Hashable,Binary,Storable,NFData) fileInfoNoHash :: FileInfo FileInfoHash fileInfoNoHash = FileInfo 1 -- Equal to nothing fileInfo :: Word32 -> FileInfo a fileInfo a = FileInfo $ if a > maxBound - 2 then a else a + 2 instance Show (FileInfo a) where show (FileInfo x) | x == 0 = "EQ" | x == 1 = "NEQ" | otherwise = "0x" ++ map toUpper (showHex (x-2) "") instance Eq (FileInfo a) where FileInfo a == FileInfo b | a == 0 || b == 0 = True | a == 1 || b == 1 = False | otherwise = a == b data FileInfoHash; type FileHash = FileInfo FileInfoHash data FileInfoMod ; type ModTime = FileInfo FileInfoMod data FileInfoSize; type FileSize = FileInfo FileInfoSize getFileHash :: FileName -> IO FileHash getFileHash x = withFile (fileNameToString x) ReadMode $ \h -> do s <- LBS.hGetContents h let res = fileInfo $ fromIntegral $ hash s 
evaluate res return res -- If the result isn't strict then we are referencing a much bigger structure, -- and it causes a space leak I don't really understand on Linux when running -- the 'tar' test, followed by the 'benchmark' test. -- See this blog post: https://neilmitchell.blogspot.co.uk/2015/09/three-space-leaks.html result :: Word32 -> Word32 -> IO (Maybe (ModTime, FileSize)) result x y = do x <- evaluate $ fileInfo x y <- evaluate $ fileInfo y return $ Just (x, y) getFileInfo :: FileName -> IO (Maybe (ModTime, FileSize)) #if defined(PORTABLE) -- Portable fallback getFileInfo x = handleBool isDoesNotExistError (const $ return Nothing) $ do let file = fileNameToString x time <- getModificationTime file size <- withFile file ReadMode hFileSize result (extractFileTime time) (fromIntegral size) -- deal with difference in return type of getModificationTime between directory versions class ExtractFileTime a where extractFileTime :: a -> Word32 #if __GLASGOW_HASKELL__ < 706 instance ExtractFileTime ClockTime where extractFileTime (TOD t _) = fromIntegral t #endif instance ExtractFileTime UTCTime where extractFileTime = floor . fromRational . toRational . 
utctDayTime #elif defined(mingw32_HOST_OS) -- Directly against the Win32 API, twice as fast as the portable version getFileInfo x = BS.useAsCString (fileNameToByteString x) $ \file -> alloca_WIN32_FILE_ATTRIBUTE_DATA $ \fad -> do res <- c_GetFileAttributesExA file 0 fad code <- peekFileAttributes fad let peek = do code <- peekFileAttributes fad if testBit code 4 then errorDirectoryNotFile $ fileNameToString x else join $ liftM2 result (peekLastWriteTimeLow fad) (peekFileSizeLow fad) if res then peek else if BS.any (>= chr 0x80) (fileNameToByteString x) then withCWString (fileNameToString x) $ \file -> do res <- c_GetFileAttributesExW file 0 fad if res then peek else return Nothing else return Nothing #ifdef x86_64_HOST_ARCH #define CALLCONV ccall #else #define CALLCONV stdcall #endif foreign import CALLCONV unsafe "Windows.h GetFileAttributesExA" c_GetFileAttributesExA :: Ptr CChar -> Int32 -> Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Bool foreign import CALLCONV unsafe "Windows.h GetFileAttributesExW" c_GetFileAttributesExW :: Ptr CWchar -> Int32 -> Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Bool data WIN32_FILE_ATTRIBUTE_DATA alloca_WIN32_FILE_ATTRIBUTE_DATA :: (Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO a) -> IO a alloca_WIN32_FILE_ATTRIBUTE_DATA act = allocaBytes size_WIN32_FILE_ATTRIBUTE_DATA act where size_WIN32_FILE_ATTRIBUTE_DATA = 36 peekFileAttributes :: Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Word32 peekFileAttributes p = peekByteOff p index_WIN32_FILE_ATTRIBUTE_DATA_dwFileAttributes where index_WIN32_FILE_ATTRIBUTE_DATA_dwFileAttributes = 0 peekLastWriteTimeLow :: Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Word32 peekLastWriteTimeLow p = peekByteOff p index_WIN32_FILE_ATTRIBUTE_DATA_ftLastWriteTime_dwLowDateTime where index_WIN32_FILE_ATTRIBUTE_DATA_ftLastWriteTime_dwLowDateTime = 20 peekFileSizeLow :: Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Word32 peekFileSizeLow p = peekByteOff p index_WIN32_FILE_ATTRIBUTE_DATA_nFileSizeLow where index_WIN32_FILE_ATTRIBUTE_DATA_nFileSizeLow = 32 #else 
-- Unix version getFileInfo x = handleBool isDoesNotExistError' (const $ return Nothing) $ do s <- getFileStatus $ fileNameToByteString x if isDirectory s then errorDirectoryNotFile $ fileNameToString x else result (extractFileTime s) (fromIntegral $ fileSize s) where isDoesNotExistError' e = isDoesNotExistError e || ioeGetErrorType e == InappropriateType extractFileTime :: FileStatus -> Word32 #ifndef MIN_VERSION_unix #define MIN_VERSION_unix(a,b,c) 0 #endif #if MIN_VERSION_unix(2,6,0) extractFileTime x = ceiling $ modificationTimeHiRes x * 1e4 -- precision of 0.1ms #else extractFileTime x = fromIntegral $ fromEnum $ modificationTime x #endif #endif shake-0.16.4/src/Development/Shake/Internal/Errors.hs0000644000000000000000000001304213261223301020544 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable, PatternGuards, RecordWildCards, CPP #-} -- | Errors seen by the user module Development.Shake.Internal.Errors( ShakeException(..), errorInternal, errorStructured, errorNoRuleToBuildType, errorRuleDefinedMultipleTimes, errorMultipleRulesMatch, errorRuleRecursion, errorComplexRecursion, errorNoApply, errorDirectoryNotFile ) where import Data.Tuple.Extra import Control.Exception.Extra import Data.Typeable import Data.List errorInternal :: String -> a errorInternal msg = error $ "Development.Shake: Internal error, please report to Neil Mitchell (" ++ msg ++ ")" alternatives = let (*) = (,) in ["_rule_" * "oracle" ,"_Rule_" * "Oracle" ,"_key_" * "question" ,"_Key_" * "Question" ,"_result_" * "answer" ,"_Result_" * "Answer" ,"_addBuiltinRule_" * "addOracle" ,"_apply_" * "askOracle"] errorStructured :: String -> [(String, Maybe String)] -> String -> IO a errorStructured msg args hint = errorIO $ errorStructuredContents msg args hint errorStructuredContents :: String -> [(String, Maybe String)] -> String -> String errorStructuredContents msg args hint = unlines $ [msg ++ ":"] ++ [" " ++ a ++ [':' | a /= ""] ++ replicate (as - length a + 2) ' ' ++ b | (a,b) <- args2] ++ 
[hint | hint /= ""] where as = maximum $ 0 : map (length . fst) args2 args2 = [(a,b) | (a,Just b) <- args] structured :: Bool -> String -> [(String, Maybe String)] -> String -> IO a structured alt msg args hint = errorStructured (f msg) (map (first f) args) (f hint) where f = filter (/= '_') . (if alt then g else id) g xs | (a,b):_ <- filter (\(a,b) -> a `isPrefixOf` xs) alternatives = b ++ g (drop (length a) xs) g (x:xs) = x : g xs g [] = [] errorDirectoryNotFile :: FilePath -> IO a errorDirectoryNotFile dir = errorStructured "Build system error - expected a file, got a directory" [("Directory", Just dir)] "Probably due to calling 'need' on a directory. Shake only permits 'need' on files." errorNoRuleToBuildType :: TypeRep -> Maybe String -> Maybe TypeRep -> IO a errorNoRuleToBuildType tk k tv = structured (specialIsOracleKey tk) "Build system error - no _rule_ matches the _key_ type" [("_Key_ type", Just $ show tk) ,("_Key_ value", k) ,("_Result_ type", fmap show tv)] "You are missing a call to _addBuiltinRule_, or your call to _apply_ has the wrong _key_ type" errorRuleDefinedMultipleTimes :: TypeRep-> IO a errorRuleDefinedMultipleTimes tk = structured (specialIsOracleKey tk) "Build system error - _rule_ defined twice at one _key_ type" [("_Key_ type", Just $ show tk)] "You have called _addBuiltinRule_ more than once on the same key type" errorMultipleRulesMatch :: TypeRep -> String -> Int -> IO a errorMultipleRulesMatch tk k count | specialIsOracleKey tk, count == 0 = errorInternal $ "no oracle match for " ++ show tk -- they are always irrefutable rules | specialIsOracleKey tk = errorStructured "Build system error - duplicate oracles for the same question type" [("Question type",Just $ show tk) ,("Question value",Just k)] "Only one call to addOracle is allowed per question type" | otherwise = errorStructured ("Build system error - key matches " ++ (if count == 0 then "no" else "multiple") ++ " rules") [("Key type",Just $ show tk) ,("Key value",Just k) ,("Rules 
matched",Just $ show count)] (if count == 0 then "Either add a rule that produces the above key, or stop requiring the above key" else "Modify your rules/defaultRules so only one can produce the above key") errorRuleRecursion :: [String] -> TypeRep -> String -> IO a -- may involve both rules and oracle, so report as only rules errorRuleRecursion stack tk k = throwIO $ wrap $ toException $ ErrorCall $ errorStructuredContents "Build system error - recursion detected" [("Key type",Just $ show tk) ,("Key value",Just k)] "Rules may not be recursive" where wrap = if null stack then id else toException . ShakeException (last stack) stack errorComplexRecursion :: [String] -> IO a errorComplexRecursion ks = errorStructured "Build system error - indirect recursion detected" [("Key value " ++ show i, Just k) | (i, k) <- zip [1..] ks] "Rules may not be recursive" errorNoApply :: TypeRep -> Maybe String -> String -> IO a errorNoApply tk k msg = structured (specialIsOracleKey tk) "Build system error - cannot currently call _apply_" [("Reason", Just msg) ,("_Key_ type", Just $ show tk) ,("_Key_ value", k)] "Move the _apply_ call earlier/later" -- Should be in Special, but then we get an import cycle specialIsOracleKey :: TypeRep -> Bool specialIsOracleKey t = con == "OracleQ" where con = show $ fst $ splitTyConApp t -- | Error representing all expected exceptions thrown by Shake. -- Problems when executing rules will be raised using this exception type. data ShakeException = ShakeException {shakeExceptionTarget :: String -- ^ The target that was being built when the exception occurred. ,shakeExceptionStack :: [String] -- ^ The stack of targets, where the 'shakeExceptionTarget' is last. ,shakeExceptionInner :: SomeException -- ^ The underlying exception that was raised. 
} deriving Typeable instance Exception ShakeException instance Show ShakeException where show ShakeException{..} = unlines $ "Error when running Shake build system:" : map ("* " ++) shakeExceptionStack ++ [displayException shakeExceptionInner] shake-0.16.4/src/Development/Shake/Internal/Derived.hs0000644000000000000000000003121113261223301020650 0ustar0000000000000000{-# LANGUAGE ScopedTypeVariables #-} module Development.Shake.Internal.Derived( copyFile', copyFileChanged, readFile', readFileLines, writeFile', writeFileLines, writeFileChanged, withTempFile, withTempDir, withTempFileWithin, withTempDirWithin, getHashedShakeVersion, getShakeExtra, getShakeExtraRules, addShakeExtra, par, forP, newResource, newThrottle, withResources, newCache ) where import Control.Applicative import Control.Monad.Extra import Control.Monad.IO.Class import System.Directory import System.FilePath (takeDirectory) import System.IO import qualified System.IO.Extra as IO import Development.Shake.Internal.Errors import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Options import Development.Shake.Internal.Rules.File import qualified Data.ByteString as BS import qualified Data.HashMap.Strict as Map import General.Extra import Data.List.Extra import Data.Hashable import Data.Typeable.Extra import Data.Dynamic import Prelude -- | Get a checksum of a list of files, suitable for using as `shakeVersion`. -- This will trigger a rebuild when the Shake rules defined in any of the files are changed. -- For example: -- -- @ -- main = do -- ver <- 'getHashedShakeVersion' [\"Shakefile.hs\"] -- 'shakeArgs' 'shakeOptions'{'shakeVersion' = ver} ... -- @ -- -- To automatically detect the name of the current file, turn on the @TemplateHaskell@ -- extension and write @$(LitE . StringL . loc_filename \<$\> location)@. 
-- -- This feature can be turned off during development by passing -- the flag @--no-rule-version@ or setting 'shakeVersionIgnore' to 'True'. getHashedShakeVersion :: [FilePath] -> IO String getHashedShakeVersion files = do hashes <- mapM (fmap (hashWithSalt 0) . BS.readFile) files return $ "hash-" ++ show (hashWithSalt 0 hashes) -- | Get an item from 'shakeExtra', using the requested type as the key. Fails -- if the value found at this key does not match the requested type. getShakeExtra :: Typeable a => Action (Maybe a) getShakeExtra = liftIO . lookupShakeExtra . shakeExtra =<< getShakeOptions -- | A version of 'getShakeExtra' in 'Rules'. getShakeExtraRules :: Typeable a => Rules (Maybe a) getShakeExtraRules = liftIO . lookupShakeExtra . shakeExtra =<< getShakeOptionsRules lookupShakeExtra :: forall a . Typeable a => Map.HashMap TypeRep Dynamic -> IO (Maybe a) lookupShakeExtra mp = case Map.lookup want mp of Just dyn | Just x <- fromDynamic dyn -> return $ Just x | otherwise -> errorStructured "shakeExtra value is malformed, all keys and values must agree" [("Key", Just $ show want) ,("Value", Just $ show $ dynTypeRep dyn)] "Use addShakeExtra to ensure shakeExtra is well-formed" Nothing -> return Nothing where want = typeRep (Proxy :: Proxy a) -- | Add a properly structured value to 'shakeExtra' which can be retrieved with 'getShakeExtra'. addShakeExtra :: Typeable a => a -> Map.HashMap TypeRep Dynamic -> Map.HashMap TypeRep Dynamic addShakeExtra x = Map.insert (typeOf x) (toDyn x) -- | @copyFile' old new@ copies the existing file from @old@ to @new@. -- The @old@ file will be tracked as a dependency. -- Also creates the new directory if necessary. copyFile' :: FilePath -> FilePath -> Action () copyFile' old new = do need [old] putLoud $ "Copying from " ++ old ++ " to " ++ new liftIO $ do createDirectoryRecursive $ takeDirectory new copyFile old new -- | @copyFileChanged old new@ copies the existing file from @old@ to @new@, if the contents have changed. 
-- The @old@ file will be tracked as a dependency. -- Also creates the new directory if necessary. copyFileChanged :: FilePath -> FilePath -> Action () copyFileChanged old new = do need [old] -- in newer versions of the directory package we can use copyFileWithMetadata which (we think) updates -- the timestamp as well and thus no need to read the source file twice. unlessM (liftIO $ doesFileExist new &&^ IO.fileEq old new) $ do putLoud $ "Copying from " ++ old ++ " to " ++ new liftIO $ do createDirectoryRecursive $ takeDirectory new -- copyFile does a lot of clever stuff with permissions etc, so make sure we just reuse it liftIO $ copyFile old new -- | Read a file, after calling 'need'. The argument file will be tracked as a dependency. readFile' :: FilePath -> Action String readFile' x = need [x] >> liftIO (readFile x) -- | Write a file, lifted to the 'Action' monad. writeFile' :: MonadIO m => FilePath -> String -> m () writeFile' name x = liftIO $ do createDirectoryRecursive $ takeDirectory name writeFile name x -- | A version of 'readFile'' which also splits the result into lines. -- The argument file will be tracked as a dependency. readFileLines :: FilePath -> Action [String] readFileLines = fmap lines . readFile' -- | A version of 'writeFile'' which writes out a list of lines. writeFileLines :: MonadIO m => FilePath -> [String] -> m () writeFileLines name = writeFile' name . unlines -- | Write a file, but only if the contents would change. writeFileChanged :: MonadIO m => FilePath -> String -> m () writeFileChanged name x = liftIO $ do createDirectoryRecursive $ takeDirectory name b <- doesFileExist name if not b then writeFile name x else do -- Cannot use ByteString here, since it has different line handling -- semantics on Windows b <- withFile name ReadMode $ \h -> do src <- hGetContents h return $! src /= x when b $ writeFile name x -- | Create a temporary file in the temporary directory. 
The file will be deleted -- after the action completes (provided the file is not still open). -- The 'FilePath' will not have any file extension, will exist, and will be zero bytes long. -- If you require a file with a specific name, use 'withTempDir'. withTempFile :: (FilePath -> Action a) -> Action a withTempFile act = do (file, del) <- liftIO IO.newTempFile act file `actionFinally` del -- | Like 'withTempFile' but using a custom temporary directory. withTempFileWithin :: FilePath -> (FilePath -> Action a) -> Action a withTempFileWithin tdir act = do (file, del) <- liftIO $ IO.newTempFileWithin tdir act file `actionFinally` del -- | Create a temporary directory inside the system temporary directory. -- The directory will be deleted after the action completes. As an example: -- -- @ -- 'withTempDir' $ \\mydir -> do -- 'putNormal' $ \"Temp directory is \" ++ mydir -- 'writeFile'' (mydir \</\> \"test.txt\") \"writing out a temp file\" -- @ withTempDir :: (FilePath -> Action a) -> Action a withTempDir act = do (dir,del) <- liftIO IO.newTempDir act dir `actionFinally` del -- | Like 'withTempDir' but using a custom temporary directory. withTempDirWithin :: FilePath -> (FilePath -> Action a) -> Action a withTempDirWithin tdir act = do (dir,del) <- liftIO $ IO.newTempDirWithin tdir act dir `actionFinally` del -- | A 'parallel' version of 'forM'. forP :: [a] -> (a -> Action b) -> Action [b] forP xs f = parallel $ map f xs -- | Execute two operations in parallel, based on 'parallel'. par :: Action a -> Action b -> Action (a,b) par a b = (\[Left a, Right b] -> (a,b)) <$> parallel [Left <$> a, Right <$> b] -- | Create a finite resource, given a name (for error messages) and a quantity of the resource that exists. -- Shake will ensure that actions using the same finite resource do not execute in parallel. -- As an example, only one set of calls to the Excel API can occur at one time, therefore -- Excel is a finite resource of quantity 1. 
You can write: -- -- @ -- 'Development.Shake.shake' 'Development.Shake.shakeOptions'{'Development.Shake.shakeThreads'=2} $ do -- 'Development.Shake.want' [\"a.xls\",\"b.xls\"] -- excel <- 'Development.Shake.newResource' \"Excel\" 1 -- \"*.xls\" 'Development.Shake.%>' \\out -> -- 'Development.Shake.withResource' excel 1 $ -- 'Development.Shake.cmd' \"excel\" out ... -- @ -- -- Now the two calls to @excel@ will not happen in parallel. -- -- As another example, calls to compilers are usually CPU bound but calls to linkers are usually -- disk bound. Running 8 linkers will often cause an 8 CPU system to grind to a halt. We can limit -- ourselves to 4 linkers with: -- -- @ -- disk <- 'Development.Shake.newResource' \"Disk\" 4 -- 'Development.Shake.want' [show i 'Development.Shake.FilePath.<.>' \"exe\" | i <- [1..100]] -- \"*.exe\" 'Development.Shake.%>' \\out -> -- 'Development.Shake.withResource' disk 1 $ -- 'Development.Shake.cmd' \"ld -o\" [out] ... -- \"*.o\" 'Development.Shake.%>' \\out -> -- 'Development.Shake.cmd' \"cl -o\" [out] ... -- @ newResource :: String -> Int -> Rules Resource newResource name mx = liftIO $ newResourceIO name mx -- | Create a throttled resource, given a name (for error messages) and a number of resources (the 'Int') that can be -- used per time period (the 'Double' in seconds). Shake will ensure that actions using the same throttled resource -- do not exceed the limits. As an example, let us assume that making more than 1 request every 5 seconds to -- Google results in our client being blacklisted, we can write: -- -- @ -- google <- 'Development.Shake.newThrottle' \"Google\" 1 5 -- \"*.url\" 'Development.Shake.%>' \\out -> do -- 'Development.Shake.withResource' google 1 $ -- 'Development.Shake.cmd' \"wget\" [\"https:\/\/google.com?q=\" ++ 'Development.Shake.FilePath.takeBaseName' out] \"-O\" [out] -- @ -- -- Now we will wait at least 5 seconds after querying Google before performing another query. 
If Google change the rules to -- allow 12 requests per minute we can instead use @'Development.Shake.newThrottle' \"Google\" 12 60@, which would allow -- greater parallelisation, and avoid throttling entirely if only a small number of requests are necessary. -- -- In the original example we never make a fresh request until 5 seconds after the previous request has /completed/. If we instead -- want to throttle requests since the previous request /started/ we can write: -- -- @ -- google <- 'Development.Shake.newThrottle' \"Google\" 1 5 -- \"*.url\" 'Development.Shake.%>' \\out -> do -- 'Development.Shake.withResource' google 1 $ return () -- 'Development.Shake.cmd' \"wget\" [\"https:\/\/google.com?q=\" ++ 'Development.Shake.FilePath.takeBaseName' out] \"-O\" [out] -- @ -- -- However, the rule may not continue running immediately after 'Development.Shake.withResource' completes, so while -- we will never exceed an average of 1 request every 5 seconds, we may end up running an unbounded number of -- requests simultaneously. If this limitation causes a problem in practice it can be fixed. newThrottle :: String -> Int -> Double -> Rules Resource newThrottle name count period = liftIO $ newThrottleIO name count period -- | Run an action which uses part of several finite resources. Acquires the resources in a stable -- order, to prevent deadlock. If all rules requiring more than one resource acquire those -- resources with a single call to 'withResources', resources will not deadlock. withResources :: [(Resource, Int)] -> Action a -> Action a withResources res act | (r,i):_ <- filter ((< 0) . snd) res = error $ "You cannot acquire a negative quantity of " ++ show r ++ ", requested " ++ show i | otherwise = f $ groupSort res where f [] = act f ((r,xs):rs) = withResource r (sum xs) $ f rs -- | Given an action on a key, produce a cached version that will execute the action at most once per key per run. 
-- Using the cached result will still include any dependencies that the action requires. -- Each call to 'newCache' creates a separate cache that is independent of all other calls to 'newCache'. -- The operations will not be cached between runs and nothing will be persisted to the Shake database. -- -- This function is useful when creating files that store intermediate values, -- to avoid the overhead of repeatedly reading from disk, particularly if the file requires expensive parsing. -- As an example: -- -- @ -- digits \<- 'newCache' $ \\file -> do -- src \<- readFile\' file -- return $ length $ filter isDigit src -- \"*.digits\" 'Development.Shake.%>' \\x -> do -- v1 \<- digits ('dropExtension' x) -- v2 \<- digits ('dropExtension' x) -- 'Development.Shake.writeFile'' x $ show (v1,v2) -- @ -- -- To create the result @MyFile.txt.digits@ the file @MyFile.txt@ will be read and counted, but only at most -- once per execution. newCache :: (Eq k, Hashable k) => (k -> Action v) -> Rules (k -> Action v) newCache = liftIO . newCacheIO shake-0.16.4/src/Development/Shake/Internal/Demo.hs0000644000000000000000000001014313261223301020153 0ustar0000000000000000 -- | Demo tutorial, accessed with --demo module Development.Shake.Internal.Demo(demo) where import Development.Shake.Internal.Paths import Development.Shake.Command import Control.Applicative import Control.Exception.Extra import Control.Monad import Data.Char import Data.List import Data.Maybe import System.Directory import System.Exit import System.FilePath import General.Extra import Development.Shake.FilePath(exe) import System.IO import System.Info.Extra import Prelude demo :: Bool -> IO () demo auto = do hSetBuffering stdout NoBuffering putStrLn $ "% Welcome to the Shake v" ++ shakeVersionString ++ " demo mode!" putStr "% Detecting machine configuration... " hasManual <- hasManualData ghc <- isJust <$> findExecutable "ghc" (gcc, gccPath) <- findGcc shakeLib <- wrap $ fmap (not . null . words . 
fromStdout) (cmd "ghc-pkg list --simple-output shake") ninja <- findExecutable "ninja" putStrLn "done\n" let path = if isWindows then "%PATH%" else "$PATH" require ghc $ "% You don't have 'ghc' on your " ++ path ++ ", which is required to run the demo." require gcc $ "% You don't have 'gcc' on your " ++ path ++ ", which is required to run the demo." require shakeLib "% You don't have the 'shake' library installed with GHC, which is required to run the demo." require hasManual "% You don't have the Shake data files installed, which are required to run the demo." empty <- not . any (not . all (== '.')) <$> getDirectoryContents "." dir <- if empty then getCurrentDirectory else do home <- getHomeDirectory dir <- getDirectoryContents home return $ home </> head (map ("shake-demo" ++) ("":map show [2..]) \\ dir) putStrLn "% The Shake demo uses an empty directory, OK to use:" putStrLn $ "% " ++ dir b <- yesNo auto require b "% Please create an empty directory to run the demo from, then run 'shake --demo' again." putStr "% Copying files... " copyManualData dir unless isWindows $ do p <- getPermissions $ dir </> "build.sh" setPermissions (dir </> "build.sh") p{executable=True} putStrLn "done" let pause = do putStr "% Press ENTER to continue: " if auto then putLine "" else getLine let execute x = do putStrLn $ "% RUNNING: " ++ x cmd (Cwd dir) (AddPath [] (maybeToList gccPath)) Shell x :: IO () let build = if isWindows then "build" else "./build.sh" putStrLn "\n% [1/5] Building an example project with Shake." pause putStrLn $ "% RUNNING: cd " ++ dir execute build putStrLn "\n% [2/5] Running the produced example." pause execute $ "_build" </> "run" <.> exe putStrLn "\n% [3/5] Rebuilding an example project with Shake (nothing should change)." pause execute build putStrLn "\n% [4/5] Cleaning the build." pause execute $ build ++ " clean" putStrLn "\n% [5/5] Rebuilding with 2 threads and profiling." 
pause execute $ build ++ " -j2 --report --report=-" putStrLn "\n% See the profiling summary above, or look at the HTML profile report in" putStrLn $ "% " ++ dir </> "report.html" putStrLn "\n% Demo complete - all the examples can be run from:" putStrLn $ "% " ++ dir putStrLn "% For more info see https://shakebuild.com" when (isJust ninja) $ do putStrLn "\n% PS. Shake can also execute Ninja build files" putStrLn "% For more info see https://shakebuild.com/ninja" -- | Require the user to press @y@ before continuing. yesNo :: Bool -> IO Bool yesNo auto = do putStr "% [Y/N] (then ENTER): " x <- if auto then putLine "y" else fmap (map toLower) getLine if "y" `isPrefixOf` x then return True else if "n" `isPrefixOf` x then return False else yesNo auto putLine :: String -> IO String putLine x = putStrLn x >> return x -- | Replace exceptions with 'False'. wrap :: IO Bool -> IO Bool wrap act = act `catch_` const (return False) -- | Require a condition to be true, or exit with a message. require :: Bool -> String -> IO () require b msg = unless b $ putStrLn msg >> exitFailure shake-0.16.4/src/Development/Shake/Internal/CmdOption.hs0000644000000000000000000000461513261223301021172 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable #-} module Development.Shake.Internal.CmdOption(CmdOption(..)) where import Data.Data import qualified Data.ByteString.Lazy.Char8 as LBS -- | Options passed to 'command' or 'cmd' to control how processes are executed. data CmdOption = Cwd FilePath -- ^ Change the current directory in the spawned process. By default uses this process's current directory. | Env [(String,String)] -- ^ Change the environment variables in the spawned process. By default uses this process's environment. | AddEnv String String -- ^ Add an environment variable in the child process. | RemEnv String -- ^ Remove an environment variable from the child process. | AddPath [String] [String] -- ^ Add some items to the prefix and suffix of the @$PATH@ variable. 
| Stdin String -- ^ Given as the @stdin@ of the spawned process. By default the @stdin@ is inherited. | StdinBS LBS.ByteString -- ^ Given as the @stdin@ of the spawned process. | FileStdin FilePath -- ^ Take the @stdin@ from a file. | Shell -- ^ Pass the command to the shell without escaping - any arguments will be joined with spaces. By default arguments are escaped properly. | BinaryPipes -- ^ Treat the @stdin@\/@stdout@\/@stderr@ messages as binary. By default 'String' results use text encoding and 'ByteString' results use binary encoding. | Traced String -- ^ Name to use with 'traced', or @\"\"@ for no tracing. By default traces using the name of the executable. | Timeout Double -- ^ Abort the computation after N seconds, raising a failure exit code. Calls 'interruptProcessGroupOf' and 'terminateProcess', but may sometimes fail to abort the process and not timeout. | WithStdout Bool -- ^ Should I include the @stdout@ in the exception if the command fails? Defaults to 'False'. | WithStderr Bool -- ^ Should I include the @stderr@ in the exception if the command fails? Defaults to 'True'. | EchoStdout Bool -- ^ Should I echo the @stdout@? Defaults to 'True' unless a 'Stdout' result is required or you use 'FileStdout'. | EchoStderr Bool -- ^ Should I echo the @stderr@? Defaults to 'True' unless a 'Stderr' result is required or you use 'FileStderr'. | FileStdout FilePath -- ^ Redirect the @stdout@ to a file. | FileStderr FilePath -- ^ Redirect the @stderr@ to a file. | AutoDeps -- ^ Compute dependencies automatically. deriving (Eq,Ord,Show,Data,Typeable) shake-0.16.4/src/Development/Shake/Internal/Args.hs0000644000000000000000000004416113261223301020172 0ustar0000000000000000{-# LANGUAGE TupleSections #-} -- | Command line parsing flags. 
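The 'AddPath' option above takes two lists: entries prepended to the search path and entries appended to it. A pure sketch of that behaviour, for illustration only (the @addPath@ name and the @\':\'@ separator are assumptions here; the real implementation is platform-aware):

```haskell
import Data.List (intercalate)

-- Illustrative helper (not Shake code): show the effect AddPath pre post
-- has on a PATH-style string. 'pre' entries go in front, 'post' at the back.
-- Assumes ':' as the separator for brevity.
addPath :: [String] -> [String] -> String -> String
addPath pre post path = intercalate ":" (pre ++ [path] ++ post)

main :: IO ()
main = putStrLn (addPath ["/opt/gcc/bin"] ["/extra/bin"] "/usr/bin:/bin")
```

This is why the demo above can pass @AddPath [] (maybeToList gccPath)@: it appends the detected @gcc@ directory without disturbing the existing path.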
module Development.Shake.Internal.Args( shakeOptDescrs, shakeArgs, shakeArgsWith, shakeArgsOptionsWith ) where import Development.Shake.Internal.Paths import Development.Shake.Internal.Options import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Demo import Development.Shake.FilePath import Development.Shake.Internal.Rules.File import Development.Shake.Internal.Progress import Development.Shake.Internal.Shake import General.Timing import General.GetOpt import Data.Tuple.Extra import Control.Concurrent import Control.Exception.Extra import Control.Monad import Data.Char import Data.Either import Data.Functor import Data.List import Data.Maybe import System.Directory.Extra import System.Environment import System.Exit import System.Time.Extra import Prelude -- | Run a build system using command line arguments for configuration. -- The available flags are those from 'shakeOptDescrs', along with a few additional -- @make@ compatible flags that are not represented in 'ShakeOptions', such as @--print-directory@. -- If there are no file arguments then the 'Rules' are used directly, otherwise the file arguments -- are 'want'ed (after calling 'withoutActions'). As an example: -- -- @ -- main = 'shakeArgs' 'shakeOptions'{'shakeFiles' = \"_make\", 'shakeProgress' = 'progressSimple'} $ do -- 'phony' \"clean\" $ 'Development.Shake.removeFilesAfter' \"_make\" [\"\/\/*\"] -- 'want' [\"_make\/neil.txt\",\"_make\/emily.txt\"] -- \"_make\/*.txt\" '%>' \\out -> -- ... build action here ... -- @ -- -- This build system will default to building @neil.txt@ and @emily.txt@, while showing progress messages, -- and putting the Shake files in locations such as @_make\/.database@. Some example command line flags: -- -- * @main --no-progress@ will turn off progress messages. -- -- * @main -j6@ will build on 6 threads. -- -- * @main --help@ will display a list of supported flags. 
-- -- * @main clean@ will not build anything, but will remove the @_make@ directory, including -- any 'shakeFiles'. -- -- * @main _make/henry.txt@ will not build @neil.txt@ or @emily.txt@, but will instead build @henry.txt@. shakeArgs :: ShakeOptions -> Rules () -> IO () shakeArgs opts rules = shakeArgsWith opts [] f where f _ files = return $ Just $ if null files then rules else want files >> withoutActions rules -- | A version of 'shakeArgs' with more flexible handling of command line arguments. -- The caller of 'shakeArgsWith' can add additional flags (the second argument) and choose how to convert -- the flags/arguments into rules (the third argument). Given: -- -- @ -- 'shakeArgsWith' opts flags (\\flagValues argValues -> result) -- @ -- -- * @opts@ is the initial 'ShakeOptions' value, which may have some fields overridden by command line flags. -- This argument is usually 'shakeOptions', perhaps with a few fields overridden. -- -- * @flags@ is a list of flag descriptions, which either produce a 'String' containing an error -- message (typically for flags with invalid arguments, e.g. @'Left' \"could not parse as int\"@), or a value -- that is passed as @flagValues@. If you have no custom flags, pass @[]@. -- -- * @flagValues@ is a list of custom flags that the user supplied. If @flags == []@ then this list will -- be @[]@. -- -- * @argValues@ is a list of non-flag arguments, which are often treated as files and passed to 'want'. -- -- * @result@ should produce a 'Nothing' to indicate that no building needs to take place, or a 'Just' -- providing the rules that should be used. 
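The @flags@ argument described above follows the convention of 'System.Console.GetOpt' from base, where each flag parses to 'Either' an error message or a value. A self-contained sketch of that convention (the @Flag@ type and @parseArgs@ helper are illustrative, not part of Shake):

```haskell
import System.Console.GetOpt
import Data.Either (partitionEithers)

data Flag = DistCC | Jobs Int deriving (Show, Eq)

-- Flags in the style shakeArgsWith expects: Left is a parse error message.
flags :: [OptDescr (Either String Flag)]
flags =
  [ Option "" ["distcc"] (NoArg (Right DistCC)) "Run distributed."
  , Option "j" ["jobs"] (ReqArg parseJobs "N") "Use N jobs."
  ]
  where
    parseJobs s = case reads s of
      [(n, "")] -> Right (Jobs n)
      _ -> Left ("could not parse as int: " ++ s)

-- Split command line arguments into (errors, good flags, file arguments),
-- mirroring how shakeArgsWith separates flagValues from argValues.
parseArgs :: [String] -> ([String], [Flag], [String])
parseArgs args = (bad ++ errs, good, files)
  where
    (opts, files, errs) = getOpt Permute flags args
    (bad, good) = partitionEithers opts

main :: IO ()
main = print (parseArgs ["--distcc", "-j4", "out.txt"])
```

With real Shake, the @good@ values arrive as @flagValues@ and the @files@ as @argValues@.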
-- -- As an example of a build system that can use either @gcc@ or @distcc@ for compiling: -- -- @ -- import System.Console.GetOpt -- -- data Flags = DistCC deriving Eq -- flags = [Option \"\" [\"distcc\"] (NoArg $ Right DistCC) \"Run distributed.\"] -- -- main = 'shakeArgsWith' 'shakeOptions' flags $ \\flags targets -> return $ Just $ do -- if null targets then 'want' [\"result.exe\"] else 'want' targets -- let compiler = if DistCC \`elem\` flags then \"distcc\" else \"gcc\" -- \"*.o\" '%>' \\out -> do -- 'need' ... -- 'cmd' compiler ... -- ... -- @ -- -- Now you can pass @--distcc@ to use the @distcc@ compiler. shakeArgsWith :: ShakeOptions -> [OptDescr (Either String a)] -> ([a] -> [String] -> IO (Maybe (Rules ()))) -> IO () shakeArgsWith opt args f = shakeArgsOptionsWith opt args $ \so a b -> fmap (so,) <$> f a b -- | Like 'shakeArgsWith', but also lets you manipulate the 'ShakeOptions'. shakeArgsOptionsWith :: ShakeOptions -> [OptDescr (Either String a)] -> (ShakeOptions -> [a] -> [String] -> IO (Maybe (ShakeOptions, Rules ()))) -> IO () shakeArgsOptionsWith baseOpts userOptions rules = do addTiming "shakeArgsWith" args <- getArgs let (flag1,files,errs) = getOpt opts args (self,user) = partitionEithers flag1 (flagsExtra,flagsShake) = first concat $ unzip self progressReplays = [x | ProgressReplay x <- flagsExtra] progressRecords = [x | ProgressRecord x <- flagsExtra] changeDirectory = listToMaybe [x | ChangeDirectory x <- flagsExtra] printDirectory = last $ False : [x | PrintDirectory x <- flagsExtra] oshakeOpts = foldl' (flip ($)) baseOpts flagsShake shakeOpts = oshakeOpts {shakeLintInside = map (toStandard . normalise . 
addTrailingPathSeparator) $ shakeLintInside oshakeOpts ,shakeLintIgnore = map toStandard $ shakeLintIgnore oshakeOpts ,shakeOutput = if shakeColor oshakeOpts then outputColor (shakeOutput oshakeOpts) else shakeOutput oshakeOpts } let putWhen v msg = when (shakeVerbosity oshakeOpts >= v) $ shakeOutput oshakeOpts v msg let putWhenLn v msg = putWhen v $ msg ++ "\n" let showHelp = do progName <- getProgName putWhen Quiet $ unlines $ ("Usage: " ++ progName ++ " [options] [target] ...") : "Options:" : showOptDescr opts when (errs /= []) $ do putWhen Quiet $ unlines $ map ("shake: " ++) $ filter (not . null) $ lines $ unlines errs showHelp exitFailure if Help `elem` flagsExtra then showHelp else if Version `elem` flagsExtra then putWhenLn Normal $ "Shake build system, version " ++ shakeVersionString else if NumericVersion `elem` flagsExtra then putWhenLn Normal shakeVersionString else if Demo `elem` flagsExtra then demo $ shakeStaunch shakeOpts else if not $ null progressReplays then do dat <- forM progressReplays $ \file -> do src <- readFile file return (file, map read $ lines src) forM_ (if null $ shakeReport shakeOpts then ["-"] else shakeReport shakeOpts) $ \file -> do putWhenLn Normal $ "Writing report to " ++ file writeProgressReport file dat else do when (Sleep `elem` flagsExtra) $ threadDelay 1000000 start <- offsetTime initDataDirectory -- must be done before we start changing directory let redir = maybe id withCurrentDirectory changeDirectory shakeOpts <- if null progressRecords then return shakeOpts else do t <- offsetTime return shakeOpts{shakeProgress = \p -> bracket (forkIO $ shakeProgress shakeOpts p) killThread $ const $ progressDisplay 1 (const $ return ()) $ do p <- p t <- t forM_ progressRecords $ \file -> appendFile file $ show (t,p) ++ "\n" return p } (ran,shakeOpts,res) <- redir $ do when printDirectory $ do curdir <- getCurrentDirectory putWhenLn Normal $ "shake: In directory `" ++ curdir ++ "'" rules <- rules shakeOpts user files case rules of 
Nothing -> return (False, shakeOpts, Right ()) Just (shakeOpts, rules) -> do res <- try_ $ shake shakeOpts $ if NoBuild `elem` flagsExtra then withoutActions rules else rules return (True, shakeOpts, res) if not ran || shakeVerbosity shakeOpts < Normal || NoTime `elem` flagsExtra then either throwIO return res else let esc = if shakeColor shakeOpts then escape else flip const in case res of Left err -> if Exception `elem` flagsExtra then throwIO err else do putWhenLn Quiet $ esc "31" $ show err exitFailure Right () -> do tot <- start let (mins,secs) = divMod (ceiling tot) (60 :: Int) time = show mins ++ ":" ++ ['0' | secs < 10] ++ show secs putWhenLn Normal $ esc "32" $ "Build completed in " ++ time ++ "m" where opts = removeOverlap userOptions (map snd shakeOptsEx) `mergeOptDescr` userOptions -- | A list of command line options that can be used to modify 'ShakeOptions'. Each option returns -- either an error message (invalid argument to the flag) or a function that changes some fields -- in 'ShakeOptions'. The command line flags are @make@ compatible where possible, but additional -- flags have been added for the extra options Shake supports. shakeOptDescrs :: [OptDescr (Either String (ShakeOptions -> ShakeOptions))] shakeOptDescrs = [fmapOptDescr snd o | (True, o) <- shakeOptsEx] data Extra = ChangeDirectory FilePath | Version | NumericVersion | PrintDirectory Bool | Help | Sleep | NoTime | Exception | NoBuild | ProgressRecord FilePath | ProgressReplay FilePath | Demo deriving Eq unescape :: String -> String unescape ('\ESC':'[':xs) = unescape $ drop 1 $ dropWhile (not . 
isAlpha) xs unescape (x:xs) = x : unescape xs unescape [] = [] escape :: String -> String -> String escape code x = "\ESC[" ++ code ++ "m" ++ x ++ "\ESC[0m" outputColor :: (Verbosity -> String -> IO ()) -> Verbosity -> String -> IO () outputColor output v msg = output v $ escape "34" msg -- | True if it has a potential effect on ShakeOptions shakeOptsEx :: [(Bool, OptDescr (Either String ([Extra], ShakeOptions -> ShakeOptions)))] shakeOptsEx = [yes $ Option "a" ["abbrev"] (pairArg "abbrev" "FULL=SHORT" $ \a s -> s{shakeAbbreviations=shakeAbbreviations s ++ [a]}) "Use abbreviation in status messages." ,no $ Option "" ["no-build"] (NoArg $ Right ([NoBuild], id)) "Don't build anything." ,no $ Option "C" ["directory"] (ReqArg (\x -> Right ([ChangeDirectory x],id)) "DIRECTORY") "Change to DIRECTORY before doing anything." ,yes $ Option "" ["color","colour"] (noArg $ \s -> s{shakeColor=True}) "Colorize the output." ,no $ Option "" ["no-color","no-colour"] (noArg $ \s -> s{shakeColor=False}) "Don't colorize the output." ,yes $ Option "d" ["debug"] (OptArg (\x -> Right ([], \s -> s{shakeVerbosity=Diagnostic, shakeOutput=outputDebug (shakeOutput s) x})) "FILE") "Print lots of debugging information." ,no $ Option "" ["demo"] (NoArg $ Right ([Demo], id)) "Run in demo mode." ,yes $ Option "" ["digest"] (NoArg $ Right ([], \s -> s{shakeChange=ChangeDigest})) "Files change when digest changes." ,yes $ Option "" ["digest-and"] (NoArg $ Right ([], \s -> s{shakeChange=ChangeModtimeAndDigest})) "Files change when modtime and digest change." ,yes $ Option "" ["digest-and-input"] (NoArg $ Right ([], \s -> s{shakeChange=ChangeModtimeAndDigestInput})) "Files change on modtime (and digest for inputs)." ,yes $ Option "" ["digest-or"] (NoArg $ Right ([], \s -> s{shakeChange=ChangeModtimeOrDigest})) "Files change when modtime or digest change." ,yes $ Option "" ["digest-not"] (NoArg $ Right ([], \s -> s{shakeChange=ChangeModtime})) "Files change when modtime changes." 
,no $ Option "" ["exception"] (NoArg $ Right ([Exception], id)) "Throw exceptions directly." ,yes $ Option "" ["flush"] (intArg 1 "flush" "N" (\i s -> s{shakeFlush=Just i})) "Flush metadata every N seconds." ,yes $ Option "" ["never-flush"] (noArg $ \s -> s{shakeFlush=Nothing}) "Never explicitly flush metadata." ,no $ Option "h" ["help"] (NoArg $ Right ([Help],id)) "Print this message and exit." ,yes $ Option "j" ["jobs"] (optIntArg 0 "jobs" "N" $ \i s -> s{shakeThreads=fromMaybe 0 i}) "Allow N jobs/threads at once [default CPUs]." ,yes $ Option "k" ["keep-going"] (noArg $ \s -> s{shakeStaunch=True}) "Keep going when some targets can't be made." ,yes $ Option "l" ["lint"] (noArg $ \s -> s{shakeLint=Just LintBasic}) "Perform limited validation after the run." ,yes $ Option "" ["lint-fsatrace"] (noArg $ \s -> s{shakeLint=Just LintFSATrace}) "Use fsatrace to do validation." ,yes $ Option "" ["no-lint"] (noArg $ \s -> s{shakeLint=Nothing}) "Turn off --lint." ,yes $ Option "" ["live"] (OptArg (\x -> Right ([], \s -> s{shakeLiveFiles=shakeLiveFiles s ++ [fromMaybe "live.txt" x]})) "FILE") "List the files that are live [to live.txt]." ,yes $ Option "m" ["metadata"] (reqArg "PREFIX" $ \x s -> s{shakeFiles=x}) "Prefix for storing metadata files." ,no $ Option "" ["numeric-version"] (NoArg $ Right ([NumericVersion],id)) "Print just the version number and exit." ,yes $ Option "" ["skip-commands"] (noArg $ \s -> s{shakeRunCommands=False}) "Try and avoid running external programs." ,yes $ Option "" ["rebuild"] (OptArg (\x -> Right ([], \s -> s{shakeRebuild=shakeRebuild s ++ [(RebuildNow, fromMaybe "**" x)]})) "PATTERN") "Rebuild matching files." ,yes $ Option "" ["no-rebuild"] (OptArg (\x -> Right ([], \s -> s{shakeRebuild=shakeRebuild s ++ [(RebuildNormal, fromMaybe "**" x)]})) "PATTERN") "Rebuild matching files if necessary." 
,yes $ Option "" ["skip"] (OptArg (\x -> Right ([], \s -> s{shakeRebuild=shakeRebuild s ++ [(RebuildLater, fromMaybe "**" x)]})) "PATTERN") "Don't rebuild matching files this run." -- ,yes $ Option "" ["skip-forever"] (OptArg (\x -> Right ([], \s -> s{shakeRebuild=shakeRebuild s ++ [(RebuildNever, fromMaybe "**" x)]})) "PATTERN") "Don't rebuild matching files until they change." ,yes $ Option "r" ["report","profile"] (OptArg (\x -> Right ([], \s -> s{shakeReport=shakeReport s ++ [fromMaybe "report.html" x]})) "FILE") "Write out profiling information [to report.html]." ,yes $ Option "" ["no-reports"] (noArg $ \s -> s{shakeReport=[]}) "Turn off --report." ,yes $ Option "" ["rule-version"] (reqArg "VERSION" $ \x s -> s{shakeVersion=x}) "Version of the build rules." ,yes $ Option "" ["no-rule-version"] (noArg $ \s -> s{shakeVersionIgnore=True}) "Ignore the build rules version." ,yes $ Option "s" ["silent"] (noArg $ \s -> s{shakeVerbosity=Silent}) "Don't print anything." ,no $ Option "" ["sleep"] (NoArg $ Right ([Sleep],id)) "Sleep for a second before building." ,yes $ Option "S" ["no-keep-going","stop"] (noArg $ \s -> s{shakeStaunch=False}) "Turns off -k." ,yes $ Option "" ["storage"] (noArg $ \s -> s{shakeStorageLog=True}) "Write a storage log." ,yes $ Option "p" ["progress"] (progress $ optIntArg 1 "progress" "N" $ \i s -> s{shakeProgress=prog $ fromMaybe 5 i}) "Show progress messages [every N secs, default 5]." ,yes $ Option "" ["no-progress"] (noArg $ \s -> s{shakeProgress=const $ return ()}) "Don't show progress messages." ,yes $ Option "q" ["quiet"] (noArg $ \s -> s{shakeVerbosity=move (shakeVerbosity s) pred}) "Print less (pass repeatedly for even less)." ,no $ Option "" ["no-time"] (NoArg $ Right ([NoTime],id)) "Don't print build time." ,yes $ Option "" ["timings"] (noArg $ \s -> s{shakeTimings=True}) "Print phase timings." 
,yes $ Option "V" ["verbose","trace"] (noArg $ \s -> s{shakeVerbosity=move (shakeVerbosity s) succ}) "Print more (pass repeatedly for even more)." ,no $ Option "v" ["version"] (NoArg $ Right ([Version],id)) "Print the version number and exit." ,no $ Option "w" ["print-directory"] (NoArg $ Right ([PrintDirectory True],id)) "Print the current directory." ,no $ Option "" ["no-print-directory"] (NoArg $ Right ([PrintDirectory False],id)) "Turn off -w, even if it was turned on implicitly." ] where yes = (,) True no = (,) False move :: Verbosity -> (Int -> Int) -> Verbosity move x by = toEnum $ min (fromEnum mx) $ max (fromEnum mn) $ by $ fromEnum x where (mn,mx) = (asTypeOf minBound x, asTypeOf maxBound x) noArg f = NoArg $ Right ([], f) reqArg a f = ReqArg (\x -> Right ([], f x)) a intArg mn flag a f = flip ReqArg a $ \x -> case reads x of [(i,"")] | i >= mn -> Right ([],f i) _ -> Left $ "the `--" ++ flag ++ "' option requires a number, " ++ show mn ++ " or above" optIntArg mn flag a f = flip OptArg a $ maybe (Right ([], f Nothing)) $ \x -> case reads x of [(i,"")] | i >= mn -> Right ([],f $ Just i) _ -> Left $ "the `--" ++ flag ++ "' option requires a number, " ++ show mn ++ " or above" pairArg flag a f = flip ReqArg a $ \x -> case break (== '=') x of (a,'=':b) -> Right ([],f (a,b)) _ -> Left $ "the `--" ++ flag ++ "' option requires an = in the argument" progress (OptArg func msg) = flip OptArg msg $ \x -> case break (== '=') `fmap` x of Just ("record",file) -> Right ([ProgressRecord $ if null file then "progress.txt" else tail file], id) Just ("replay",file) -> Right ([ProgressReplay $ if null file then "progress.txt" else tail file], id) _ -> func x outputDebug output Nothing = output outputDebug output (Just file) = \v msg -> do when (v /= Diagnostic) $ output v msg appendFile file $ unescape msg ++ "\n" prog i p = do program <- progressProgram progressDisplay i (\s -> progressTitlebar s >> program s) p 
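The coloured output in this module round-trips through the @escape@/@unescape@ pair: @escape@ wraps a message in an ANSI colour code for the terminal, while @outputDebug@ strips codes with @unescape@ before appending to a log file. A self-contained copy of those two helpers:

```haskell
import Data.Char (isAlpha)

-- Wrap a string in an ANSI colour code, as used for the red/green
-- "Build completed" messages above.
escape :: String -> String -> String
escape code x = "\ESC[" ++ code ++ "m" ++ x ++ "\ESC[0m"

-- Strip all ANSI escape sequences, so coloured output can be written
-- to a plain log file.
unescape :: String -> String
unescape ('\ESC':'[':xs) = unescape $ drop 1 $ dropWhile (not . isAlpha) xs
unescape (x:xs) = x : unescape xs
unescape [] = []

main :: IO ()
main = putStrLn (unescape (escape "32" "Build completed in 0:05m"))
```

Stripping rather than never colouring means a single output function can serve both the terminal and the @--debug=FILE@ log.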
shake-0.16.4/src/Development/Shake/Internal/Rules/0000755000000000000000000000000013261223301020026 5ustar0000000000000000shake-0.16.4/src/Development/Shake/Internal/Rules/Rerun.hs0000644000000000000000000000270313261223301021457 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} {-# LANGUAGE TypeFamilies #-} module Development.Shake.Internal.Rules.Rerun( defaultRuleRerun, alwaysRerun ) where import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Core.Types import Development.Shake.Classes import qualified Data.ByteString as BS import General.Binary newtype AlwaysRerunQ = AlwaysRerunQ () deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show AlwaysRerunQ where show _ = "alwaysRerun" type instance RuleResult AlwaysRerunQ = () -- | Always rerun the associated action. Useful for defining rules that query -- the environment. For example: -- -- @ -- \"ghcVersion.txt\" 'Development.Shake.%>' \\out -> do -- 'alwaysRerun' -- 'Development.Shake.Stdout' stdout <- 'Development.Shake.cmd' \"ghc --numeric-version\" -- 'Development.Shake.writeFileChanged' out stdout -- @ -- -- In make, the @.PHONY@ attribute on file-producing rules has a similar effect. -- -- Note that 'alwaysRerun' is applied when a rule is executed. Modifying an existing rule -- to insert 'alwaysRerun' will /not/ cause that rule to rerun next time. 
alwaysRerun :: Action () alwaysRerun = apply1 $ AlwaysRerunQ () defaultRuleRerun :: Rules () defaultRuleRerun = addBuiltinRuleEx noLint $ \AlwaysRerunQ{} _ _ -> return $ RunResult ChangedRecomputeDiff BS.empty () shake-0.16.4/src/Development/Shake/Internal/Rules/OrderOnly.hs0000644000000000000000000000214313261223301022277 0ustar0000000000000000 module Development.Shake.Internal.Rules.OrderOnly( orderOnly, orderOnlyBS ) where import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Rules.File import qualified Data.ByteString.Char8 as BS -- | Define order-only dependencies: these are dependencies that will always -- be built before continuing, but which aren't dependencies of this action. -- Mostly useful for defining generated dependencies you think might be real dependencies. -- If they turn out to be real dependencies, you should add an explicit dependency afterwards. -- -- @ -- \"source.o\" %> \\out -> do -- 'orderOnly' [\"header.h\"] -- 'cmd_' \"gcc -c source.c -o source.o -MMD -MF source.m\" -- 'needMakefileDependencies' \"source.m\" -- @ -- -- If @header.h@ is included by @source.c@ then the call to 'needMakefileDependencies' will cause -- it to be added as a real dependency. If it isn't, then the rule won't rebuild if it changes. orderOnly :: [FilePath] -> Action () orderOnly = orderOnlyAction . need orderOnlyBS :: [BS.ByteString] -> Action () orderOnlyBS = orderOnlyAction . 
needBS shake-0.16.4/src/Development/Shake/Internal/Rules/Oracle.hs0000644000000000000000000001351313261223301021572 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} {-# LANGUAGE TypeFamilies, ConstraintKinds #-} module Development.Shake.Internal.Rules.Oracle( addOracle, addOracleCache, askOracle ) where import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Options import Development.Shake.Internal.Value import Development.Shake.Classes import qualified Data.ByteString as BS import qualified Data.ByteString.Lazy as LBS import Data.Binary import Control.Applicative import Prelude -- Use short type names, since the names appear in the Haddock, and are too long if they are in full newtype OracleQ question = OracleQ question deriving (Show,Typeable,Eq,Hashable,Binary,NFData) newtype OracleA answer = OracleA answer deriving (Show,Typeable,Eq,Hashable,Binary,NFData) type instance RuleResult (OracleQ a) = OracleA (RuleResult a) addOracleRaw :: (RuleResult q ~ a, ShakeValue q, ShakeValue a) => Bool -> (q -> Action a) -> Rules (q -> Action a) addOracleRaw cache act = do -- rebuild is automatic for oracles, skip just means we don't rebuild opts <- getShakeOptionsRules let skip = shakeRebuildApply opts "" == RebuildLater addBuiltinRule noLint $ \(OracleQ q) old changed -> case old of Just old | skip || (cache && not changed) -> return $ RunResult ChangedNothing old $ decode' old _ -> do new <- OracleA <$> act q return $ RunResult (if fmap decode' old == Just new then ChangedRecomputeSame else ChangedRecomputeDiff) (encode' new) new return askOracle where encode' :: Binary a => a -> BS.ByteString encode' = BS.concat . LBS.toChunks . encode decode' :: Binary a => BS.ByteString -> a decode' = decode . LBS.fromChunks . return -- | Add extra information which rules can depend on. 
-- An oracle is a function from a question type @q@ to an answer type @a@. -- As an example, we can define an oracle allowing you to depend on the current version of GHC: -- -- @ -- newtype GhcVersion = GhcVersion () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) -- type instance RuleResult GhcVersion = String -- rules = do -- 'addOracle' $ \\(GhcVersion _) -> fmap 'Development.Shake.fromStdout' $ 'Development.Shake.cmd' \"ghc --numeric-version\" :: Action String -- ... rules ... -- @ -- -- If a rule calls @'askOracle' (GhcVersion ())@, that rule will be rerun whenever the GHC version changes. -- Some notes: -- -- * We define @GhcVersion@ with a @newtype@ around @()@, allowing the use of @GeneralizedNewtypeDeriving@. -- All the necessary type classes are exported from "Development.Shake.Classes". -- -- * The @type instance@ requires the extension @TypeFamilies@. -- -- * Each call to 'addOracle' must use a different type of question. -- -- * Actions passed to 'addOracle' will be run in every build in which they are required, even if nothing else changes, -- so be careful of slow actions. -- If the result of an oracle does not change it will not invalidate any rules depending on it. -- To always rerun rules, see 'Development.Shake.alwaysRerun'. -- -- * If the value returned by 'askOracle' is ignored then 'askOracleWith' may help avoid ambiguous type messages. -- Alternatively, use the result of 'addOracle', which is 'askOracle' restricted to the correct type. 
-- -- As a more complex example, consider tracking Haskell package versions: -- -- @ -- newtype GhcPkgList = GhcPkgList () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) -- type instance RuleResult GhcPkgList = [(String, String)] -- newtype GhcPkgVersion = GhcPkgVersion String deriving (Show,Typeable,Eq,Hashable,Binary,NFData) -- type instance RuleResult GhcPkgVersion = Maybe String -- -- rules = do -- getPkgList \<- 'addOracle' $ \\GhcPkgList{} -> do -- Stdout out <- 'Development.Shake.cmd' \"ghc-pkg list --simple-output\" -- return [(reverse b, reverse a) | x <- words out, let (a,_:b) = break (== \'-\') $ reverse x] -- -- getPkgVersion \<- 'addOracle' $ \\(GhcPkgVersion pkg) -> do -- pkgs <- getPkgList $ GhcPkgList () -- return $ lookup pkg pkgs -- -- \"myrule\" %> \\_ -> do -- getPkgVersion $ GhcPkgVersion \"shake\" -- ... rule using the shake version ... -- @ -- -- Using these definitions, any rule depending on the version of @shake@ -- should call @getPkgVersion $ GhcPkgVersion \"shake\"@ to rebuild when @shake@ is upgraded. addOracle :: (RuleResult q ~ a, ShakeValue q, ShakeValue a) => (q -> Action a) -> Rules (q -> Action a) addOracle = addOracleRaw False -- | A combination of 'addOracle' and 'newCache' - an action that only runs when its dependencies change, -- whose result is stored in the database. -- -- * Does the information need recomputing every time? e.g. looking up stuff in the environment? -- If so, use 'addOracle' instead. -- -- * Is the action mostly deserialising some file? If so, use 'newCache'. -- -- * Is the operation an expensive computation from other results? If so, use 'addOracleCache'. -- -- An alternative to using 'addOracleCache' is introducing an intermediate file containing the result, -- which requires less storage in the Shake database and can be inspected by existing file-system viewing -- tools. 
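The @addOracleRaw@ definition above serialises oracle answers into the Shake database with @Data.Binary@ via its @encode'@/@decode'@ helpers. A self-contained copy showing the round trip (assumes the @binary@ and @bytestring@ boot libraries are available):

```haskell
import Data.Binary (Binary, encode, decode)
import qualified Data.ByteString as BS
import qualified Data.ByteString.Lazy as LBS

-- Copies of the encode'/decode' helpers from addOracleRaw: oracle answers
-- are stored as strict ByteStrings, converting through lazy ByteStrings.
encode' :: Binary a => a -> BS.ByteString
encode' = BS.concat . LBS.toChunks . encode

decode' :: Binary a => BS.ByteString -> a
decode' = decode . LBS.fromChunks . return

main :: IO ()
main = print (decode' (encode' (42 :: Int, "shake")) :: (Int, String))
```

This round trip is what lets @addOracleRaw@ compare a freshly computed answer against @fmap decode' old@ to decide between @ChangedRecomputeSame@ and @ChangedRecomputeDiff@.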
addOracleCache ::(RuleResult q ~ a, ShakeValue q, ShakeValue a) => (q -> Action a) -> Rules (q -> Action a) addOracleCache = addOracleRaw True -- | Get information previously added with 'addOracle' or 'addOracleCache'. -- The question/answer types must match those provided previously. askOracle :: (RuleResult q ~ a, ShakeValue q, ShakeValue a) => q -> Action a askOracle question = do OracleA answer <- apply1 $ OracleQ question; return answer shake-0.16.4/src/Development/Shake/Internal/Rules/Files.hs0000644000000000000000000002325013261223301021426 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} {-# LANGUAGE ViewPatterns, TypeFamilies #-} module Development.Shake.Internal.Rules.Files( (&?>), (&%>), defaultRuleFiles ) where import Control.Monad import Control.Monad.IO.Class import Data.Maybe import Data.List.Extra import Control.Applicative import Data.Typeable.Extra import General.Binary import Prelude import Development.Shake.Internal.Errors import Development.Shake.Internal.Core.Action hiding (trackAllow) import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Rules import General.Extra import Development.Shake.Internal.FileName import Development.Shake.Classes import Development.Shake.Internal.Rules.Rerun import Development.Shake.Internal.Rules.File import Development.Shake.Internal.FilePattern import Development.Shake.FilePath import Development.Shake.Internal.Options infix 1 &?>, &%> type instance RuleResult FilesQ = FilesA newtype FilesQ = FilesQ {fromFilesQ :: [FileQ]} deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) newtype FilesA = FilesA [FileA] deriving (Typeable,BinaryEx,NFData) instance Show FilesA where show (FilesA xs) = unwords $ "Files" : map (drop 5 . show) xs instance Show FilesQ where show (FilesQ xs) = unwords $ map (wrapQuote . 
show) xs filesStoredValue :: ShakeOptions -> FilesQ -> IO (Maybe FilesA) filesStoredValue opts (FilesQ xs) = fmap FilesA . sequence <$> mapM (fileStoredValue opts) xs filesEqualValue :: ShakeOptions -> FilesA -> FilesA -> EqualCost filesEqualValue opts (FilesA xs) (FilesA ys) | length xs /= length ys = NotEqual | otherwise = foldr and_ EqualCheap $ zipWith (fileEqualValue opts) xs ys where and_ NotEqual x = NotEqual and_ EqualCheap x = x and_ EqualExpensive x = if x == NotEqual then NotEqual else EqualExpensive defaultRuleFiles :: Rules () defaultRuleFiles = do opts <- getShakeOptionsRules -- A rule from FilesQ to FilesA. The result value is only useful for linting. addBuiltinRuleEx (ruleLint opts) (ruleRun opts $ shakeRebuildApply opts) ruleLint :: ShakeOptions -> BuiltinLint FilesQ FilesA ruleLint opts k (FilesA []) = return Nothing -- in the case of disabling lint ruleLint opts k v = do now <- filesStoredValue opts k return $ case now of Nothing -> Just "" Just now | filesEqualValue opts v now == EqualCheap -> Nothing | otherwise -> Just $ show now ruleRun :: ShakeOptions -> (FilePath -> Rebuild) -> BuiltinRun FilesQ FilesA ruleRun opts rebuildFlags k o@(fmap getEx -> old) dirtyChildren = do let r = map (rebuildFlags . fileNameToString . 
fromFileQ) $ fromFilesQ k case old of _ | RebuildNow `elem` r -> rebuild _ | RebuildLater `elem` r -> case old of Just old -> -- ignoring the currently stored value, which may trigger lint has changed -- so disable lint on this file return $ RunResult ChangedNothing (fromJust o) $ FilesA [] Nothing -> do -- i don't have a previous value, so assume this is a source node, and mark rebuild in future now <- liftIO $ filesStoredValue opts k case now of Nothing -> rebuild Just now -> do alwaysRerun; return $ RunResult ChangedStore (runBuilder $ putEx now) now Just old | not dirtyChildren -> do v <- liftIO $ filesStoredValue opts k case v of Just v -> case filesEqualValue opts old v of NotEqual -> rebuild EqualCheap -> return $ RunResult ChangedNothing (fromJust o) v EqualExpensive -> return $ RunResult ChangedStore (runBuilder $ putEx v) v Nothing -> rebuild _ -> rebuild where rebuild = do putWhen Chatty $ "# " ++ show k rules :: UserRule (FilesQ -> Maybe (Action FilesA)) <- getUserRules v <- case userRuleMatch rules ($ k) of [r] -> r rs -> liftIO $ errorMultipleRulesMatch (typeOf k) (show k) (length rs) let c | Just old <- old, filesEqualValue opts old v /= NotEqual = ChangedRecomputeSame | otherwise = ChangedRecomputeDiff return $ RunResult c (runBuilder $ putEx v) v -- | Define a rule for building multiple files at the same time. -- Think of it as the AND (@&&@) equivalent of '%>'. -- As an example, a single invocation of GHC produces both @.hi@ and @.o@ files: -- -- @ -- [\"*.o\",\"*.hi\"] '&%>' \\[o,hi] -> do -- let hs = o 'Development.Shake.FilePath.-<.>' \"hs\" -- 'Development.Shake.need' ... -- all files the .hs import -- 'Development.Shake.cmd' \"ghc -c\" [hs] -- @ -- -- However, in practice, it's usually easier to define rules with '%>' and make the @.hi@ depend -- on the @.o@. When defining rules that build multiple files, all the 'FilePattern' values must -- have the same sequence of @\/\/@ and @*@ wildcards in the same order. 
-- This function will create directories for the result files, if necessary. (&%>) :: [FilePattern] -> ([FilePath] -> Action ()) -> Rules () [p] &%> act = p %> act . return ps &%> act | not $ compatible ps = error $ unlines $ "All patterns to &%> must have the same number and position of // and * wildcards" : ["* " ++ p ++ (if compatible [p, head ps] then "" else " (incompatible)") | p <- ps] | otherwise = do forM_ (zip [0..] ps) $ \(i,p) -> (if simple p then id else priority 0.5) $ fileForward $ let op = (p ?==) in \file -> if not $ op file then Nothing else Just $ do FilesA res <- apply1 $ FilesQ $ map (FileQ . fileNameFromString . substitute (extract p file)) ps return $ if null res then Nothing else Just $ res !! i (if all simple ps then id else priority 0.5) $ addUserRule $ \(FilesQ xs_) -> let xs = map (fileNameToString . fromFileQ) xs_ in if not $ length xs == length ps && and (zipWith (?==) ps xs) then Nothing else Just $ do liftIO $ mapM_ createDirectoryRecursive $ nubOrd $ map takeDirectory xs trackAllow xs act xs getFileTimes "&%>" xs_ -- | Define a rule for building multiple files at the same time, a more powerful -- and more dangerous version of '&%>'. Think of it as the AND (@&&@) equivalent of '?>'. -- -- Given an application @test &?> ...@, @test@ should return @Just@ if the rule applies, and should -- return the list of files that will be produced. This list /must/ include the file passed as an argument and should -- obey the invariant: -- -- > forAll $ \x ys -> test x == Just ys ==> x `elem` ys && all ((== Just ys) . 
test) ys -- -- As an example of a function satisfying the invariant: -- -- @ -- test x | 'Development.Shake.FilePath.takeExtension' x \`elem\` [\".hi\",\".o\"] -- = Just ['Development.Shake.FilePath.dropExtension' x 'Development.Shake.FilePath.<.>' \"hi\", 'Development.Shake.FilePath.dropExtension' x 'Development.Shake.FilePath.<.>' \"o\"] -- test _ = Nothing -- @ -- -- Regardless of whether @Foo.hi@ or @Foo.o@ is passed, the function always returns @[Foo.hi, Foo.o]@. (&?>) :: (FilePath -> Maybe [FilePath]) -> ([FilePath] -> Action ()) -> Rules () (&?>) test act = priority 0.5 $ do let inputOutput suf inp out = ["Input" ++ suf ++ ":", " " ++ inp] ++ ["Output" ++ suf ++ ":"] ++ map (" "++) out let normTest = fmap (map $ toStandard . normaliseEx) . test let checkedTest x = case normTest x of Nothing -> Nothing Just ys | x `notElem` ys -> error $ unlines $ "Invariant broken in &?>, did not return the input (after normalisation)." : inputOutput "" x ys Just ys | bad:_ <- filter ((/= Just ys) . normTest) ys -> error $ unlines $ ["Invariant broken in &?>, not equalValue for all arguments (after normalisation)."] ++ inputOutput "1" x ys ++ inputOutput "2" bad (fromMaybe ["Nothing"] $ normTest bad) Just ys -> Just ys fileForward $ \x -> case checkedTest x of Nothing -> Nothing Just ys -> Just $ do FilesA res <- apply1 $ FilesQ $ map (FileQ . fileNameFromString) ys return $ if null res then Nothing else Just $ res !! fromJust (elemIndex x ys) addUserRule $ \(FilesQ xs_) -> let xs@(x:_) = map (fileNameToString . 
fromFileQ) xs_ in case checkedTest x of Just ys | ys == xs -> Just $ do liftIO $ mapM_ createDirectoryRecursive $ nubOrd $ map takeDirectory xs act xs getFileTimes "&?>" xs_ Just ys -> error $ "Error, &?> is incompatible with " ++ show xs ++ " vs " ++ show ys Nothing -> Nothing getFileTimes :: String -> [FileQ] -> Action FilesA getFileTimes name xs = do opts <- getShakeOptions let opts2 = if shakeChange opts == ChangeModtimeAndDigestInput then opts{shakeChange=ChangeModtime} else opts ys <- liftIO $ mapM (fileStoredValue opts2) xs case sequence ys of Just ys -> return $ FilesA ys Nothing | not $ shakeCreationCheck opts -> return $ FilesA [] Nothing -> do let missing = length $ filter isNothing ys error $ "Error, " ++ name ++ " rule failed to produce " ++ show missing ++ " file" ++ (if missing == 1 then "" else "s") ++ " (out of " ++ show (length xs) ++ ")" ++ concat ["\n " ++ fileNameToString x ++ if isNothing y then " - MISSING" else "" | (FileQ x,y) <- zip xs ys] shake-0.16.4/src/Development/Shake/Internal/Rules/File.hs0000644000000000000000000005465013261223301021253 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} {-# LANGUAGE ViewPatterns, RecordWildCards, FlexibleInstances, TypeFamilies #-} module Development.Shake.Internal.Rules.File( need, needHasChanged, needBS, needed, neededBS, want, trackRead, trackWrite, trackAllow, defaultRuleFile, (%>), (|%>), (?>), phony, (~>), phonys, resultHasChanged, -- * Internal only FileQ(..), FileA, fileStoredValue, fileEqualValue, EqualCost(..), fileForward ) where import Control.Applicative import Control.Monad.Extra import Control.Monad.IO.Class import Data.Typeable import Data.List import Data.Maybe import qualified Data.ByteString.Char8 as BS import qualified Data.HashSet as Set import Foreign.Storable import Data.Word import Data.Monoid import General.Binary import General.Extra import Development.Shake.Internal.Core.Types import 
Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Core.Action hiding (trackAllow) import qualified Development.Shake.Internal.Core.Action as S import Development.Shake.Internal.FileName import Development.Shake.Internal.Rules.Rerun import Development.Shake.Classes import Development.Shake.FilePath(toStandard) import Development.Shake.Internal.FilePattern import Development.Shake.Internal.FileInfo import Development.Shake.Internal.Options import Development.Shake.Internal.Errors import System.FilePath(takeDirectory) -- important that this is the system local filepath, or wrong slashes go wrong import System.IO.Unsafe(unsafeInterleaveIO) import Prelude infix 1 %>, ?>, |%>, ~> --------------------------------------------------------------------- -- TYPES type instance RuleResult FileQ = FileR -- | The unique key we use to index File rules, to avoid name clashes. newtype FileQ = FileQ {fromFileQ :: FileName} deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) -- | Raw information about a file. data FileA = FileA {-# UNPACK #-} !ModTime {-# UNPACK #-} !FileSize FileHash deriving (Typeable) -- | Result of a File rule, may contain raw file information and whether the rule did run this build data FileR = FileR { result :: Maybe FileA -- ^ Raw information about the file built by this rule. -- Set to 'Nothing' to prevent linting sometimes. , hasChanged :: Bool -- ^ Whether the file changed this build. Transient -- information, that doesn't get serialized. } deriving (Typeable) -- | The types of file rule that occur. data Mode = ModePhony (Action ()) -- ^ An action with no file value | ModeDirect (Action ()) -- ^ An action that produces this file | ModeForward (Action (Maybe FileA)) -- ^ An action that looks up a file someone else produced -- | The results of the various 'Mode' rules. data Result = ResultPhony | ResultDirect FileA | ResultForward FileA -- | The user rules we use. 
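The 'Result' type above is stored with a length-dispatched encoding (see its 'BinaryEx' instance later in this module): 'ResultPhony' serialises to the empty string, 'ResultDirect' to the 12-byte 'FileA' (its 'Storable' instance fixes it at 12 bytes), and 'ResultForward' to a tag byte followed by the 12-byte 'FileA'. A minimal sketch of the same idea, using plain ByteStrings as a stand-in for the real 'FileA' payload (this is illustrative code, not the module's actual instance):

```haskell
import qualified Data.ByteString.Char8 as BS

-- Length-dispatched decoding, mirroring the BinaryEx Result instance:
-- 0 bytes = phony, 12 bytes = direct payload, 13 bytes = tag + forward payload.
data Decoded = Phony | Direct BS.ByteString | Forward BS.ByteString
    deriving Show

decode :: BS.ByteString -> Decoded
decode bs = case BS.length bs of
    0  -> Phony
    12 -> Direct bs
    13 -> Forward (BS.tail bs)
    n  -> error $ "unexpected payload length: " ++ show n

main :: IO ()
main = print (decode BS.empty)  -- an empty payload decodes as Phony
```

Dispatching on length avoids spending a tag byte on the two common constructors, at the cost of requiring the payload to have a fixed size.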
newtype FileRule = FileRule (FilePath -> Maybe Mode) deriving Typeable --------------------------------------------------------------------- -- INSTANCES instance Show FileQ where show (FileQ x) = fileNameToString x instance BinaryEx [FileQ] where putEx = putEx . map fromFileQ getEx = map FileQ . getEx instance NFData FileA where rnf (FileA a b c) = rnf a `seq` rnf b `seq` rnf c instance NFData FileR where rnf (FileR f b) = rnf f `seq` rnf b instance Show FileA where show (FileA m s h) = "File {mod=" ++ show m ++ ",size=" ++ show s ++ ",digest=" ++ show h ++ "}" instance Show FileR where show FileR{..} = show result ++ if hasChanged then " recomputed" else " not recomputed" instance Storable FileA where sizeOf _ = 4 * 3 -- 4 Word32's alignment _ = alignment (undefined :: ModTime) peekByteOff p i = FileA <$> peekByteOff p i <*> peekByteOff p (i+4) <*> peekByteOff p (i+8) pokeByteOff p i (FileA a b c) = pokeByteOff p i a >> pokeByteOff p (i+4) b >> pokeByteOff p (i+8) c instance BinaryEx FileA where putEx = putExStorable getEx = getExStorable instance BinaryEx [FileA] where putEx = putExStorableList getEx = getExStorableList fromResult :: Result -> Maybe FileA fromResult ResultPhony = Nothing fromResult (ResultDirect x) = Just x fromResult (ResultForward x) = Just x instance BinaryEx Result where putEx ResultPhony = mempty putEx (ResultDirect x) = putEx x putEx (ResultForward x) = putEx (0 :: Word8) <> putEx x getEx x = case BS.length x of 0 -> ResultPhony 12 -> ResultDirect $ getEx x 13 -> ResultForward $ getEx $ BS.tail x --------------------------------------------------------------------- -- FILE CHECK QUERIES -- | An equality check and a cost. data EqualCost = EqualCheap -- ^ The equality check was cheap. | EqualExpensive -- ^ The equality check was expensive, as the results are not trivially equal. | NotEqual -- ^ The values are not equal. 
deriving (Eq,Ord,Show,Read,Typeable,Enum,Bounded) fileStoredValue :: ShakeOptions -> FileQ -> IO (Maybe FileA) fileStoredValue ShakeOptions{shakeChange=c} (FileQ x) = do res <- getFileInfo x case res of Nothing -> return Nothing Just (time,size) | c == ChangeModtime -> return $ Just $ FileA time size fileInfoNoHash Just (time,size) -> do hash <- unsafeInterleaveIO $ getFileHash x return $ Just $ FileA time size hash fileEqualValue :: ShakeOptions -> FileA -> FileA -> EqualCost fileEqualValue ShakeOptions{shakeChange=c} (FileA x1 x2 x3) (FileA y1 y2 y3) = case c of ChangeModtime -> bool $ x1 == y1 ChangeDigest -> bool $ x2 == y2 && x3 == y3 ChangeModtimeOrDigest -> bool $ x1 == y1 && x2 == y2 && x3 == y3 _ | x1 == y1 -> EqualCheap | x2 == y2 && x3 == y3 -> EqualExpensive | otherwise -> NotEqual where bool b = if b then EqualCheap else NotEqual -- | Arguments: options; is the file an input; a message for failure if the file does not exist; filename storedValueError :: ShakeOptions -> Bool -> String -> FileQ -> IO (Maybe FileA) {- storedValueError opts False msg x | False && not (shakeOutputCheck opts) = do when (shakeCreationCheck opts) $ do whenM (isNothing <$> (storedValue opts x :: IO (Maybe FileA))) $ error $ msg ++ "\n " ++ unpackU (fromFileQ x) return $ FileA fileInfoEq fileInfoEq fileInfoEq -} storedValueError opts input msg x = maybe def Just <$> fileStoredValue opts2 x where def = if shakeCreationCheck opts || input then error err else Nothing err = msg ++ "\n " ++ fileNameToString (fromFileQ x) opts2 = if not input && shakeChange opts == ChangeModtimeAndDigestInput then opts{shakeChange=ChangeModtime} else opts --------------------------------------------------------------------- -- THE DEFAULT RULE defaultRuleFile :: Rules () defaultRuleFile = do opts@ShakeOptions{..} <- getShakeOptionsRules -- A rule from FileQ to (Maybe FileA). The result value is only useful for linting. 
addBuiltinRuleEx (ruleLint opts) (ruleRun opts $ shakeRebuildApply opts) ruleLint :: ShakeOptions -> BuiltinLint FileQ FileR ruleLint opts k (FileR Nothing _) = return Nothing ruleLint opts k (FileR (Just v) _) = do now <- fileStoredValue opts k return $ case now of Nothing -> Just "" Just now | fileEqualValue opts v now == EqualCheap -> Nothing | otherwise -> Just $ show now ruleRun :: ShakeOptions -> (FilePath -> Rebuild) -> BuiltinRun FileQ FileR ruleRun opts@ShakeOptions{..} rebuildFlags o@(FileQ x) oldBin@(fmap getEx -> old) dirtyChildren = do -- for One, rebuild makes perfect sense -- for Forward, we expect the child will have already rebuilt - Rebuild just lets us deal with code changes -- for Phony, it doesn't make that much sense, but probably isn't harmful? let r = rebuildFlags $ fileNameToString x case old of _ | r == RebuildNow -> rebuild _ | r == RebuildLater -> case old of Just old -> -- ignoring the currently stored value, which may trigger lint has changed -- so disable lint on this file unLint <$> retOld ChangedNothing Nothing -> do -- i don't have a previous value, so assume this is a source node, and mark rebuild in future now <- liftIO $ fileStoredValue opts o case now of Nothing -> rebuild Just now -> do alwaysRerun; retNew ChangedStore $ ResultDirect now {- _ | r == RebuildNever -> do now <- liftIO $ fileStoredValue opts o case now of Nothing -> rebuild Just now -> do let diff | Just (ResultDirect old) <- old, fileEqualValue opts old now /= NotEqual = ChangedRecomputeSame | otherwise = ChangedRecomputeDiff retNew diff $ ResultDirect now -} Just (ResultDirect old) | not dirtyChildren -> do now <- liftIO $ fileStoredValue opts o case now of Nothing -> rebuild Just now -> case fileEqualValue opts old now of EqualCheap -> retNew ChangedNothing $ ResultDirect now EqualExpensive -> retNew ChangedStore $ ResultDirect now NotEqual -> rebuild Just (ResultForward old) | not dirtyChildren -> retOld ChangedNothing _ -> rebuild where -- no need to lint 
check forward files -- but more than that, it goes wrong if you do, see #427 asLint (ResultDirect x) = Just x asLint x = Nothing unLint (RunResult a b (FileR _ c)) = RunResult a b $ FileR Nothing c retNew :: RunChanged -> Result -> Action (RunResult FileR) retNew c v = return $ RunResult c (runBuilder $ putEx v) (FileR (asLint v) (c == ChangedRecomputeDiff)) retOld :: RunChanged -> Action (RunResult FileR) retOld c = return $ RunResult c (fromJust oldBin) $ FileR (asLint $ fromJust old) False -- actually run the rebuild rebuild = do putWhen Chatty $ "# " ++ show o x <- return $ fileNameToString x rules <- getUserRules act <- case userRuleMatch rules $ \(FileRule f) -> f x of [] -> return Nothing [r] -> return $ Just r rs -> liftIO $ errorMultipleRulesMatch (typeOf o) (show o) (length rs) let answer ctor new = do let b = case () of _ | Just old <- old , Just old <- fromResult old , fileEqualValue opts old new /= NotEqual -> ChangedRecomputeSame _ -> ChangedRecomputeDiff retNew b $ ctor new case act of Nothing -> do new <- liftIO $ storedValueError opts True "Error, file does not exist and no rule available:" o answer ResultDirect $ fromJust new Just (ModeForward act) -> do new <- act case new of Nothing -> do alwaysRerun retNew ChangedRecomputeDiff ResultPhony Just new -> answer ResultForward new Just (ModeDirect act) -> do act new <- liftIO $ storedValueError opts False "Error, rule finished running but did not produce file:" o case new of Nothing -> retNew ChangedRecomputeDiff ResultPhony Just new -> answer ResultDirect new Just (ModePhony act) -> do -- See #523 and #524 -- Shake runs the dependencies first, but stops when one has changed. -- We don't want to run the existing deps first if someone changes the build system, -- so insert a fake dependency that cuts the process dead. alwaysRerun act retNew ChangedRecomputeDiff ResultPhony apply_ :: (a -> FileName) -> [a] -> Action [FileR] apply_ f = apply . map (FileQ . f) -- | Has a file changed. 
This function will only give the correct answer if called in the rule -- producing the file, /before/ the rule has modified the file in question. -- Best avoided, but sometimes necessary in conjunction with 'needHasChanged' to cause rebuilds -- to happen if the result is deleted or modified. resultHasChanged :: FilePath -> Action Bool resultHasChanged file = do let filename = FileQ $ fileNameFromString file res <- getDatabaseValue filename old <- return $ case res of Nothing -> Nothing Just (Left bs) -> fromResult $ getEx bs Just (Right v) -> result v case old of Nothing -> return True Just old -> do opts <- getShakeOptions new <- liftIO $ fileStoredValue opts filename return $ case new of Nothing -> True Just new -> fileEqualValue opts old new == NotEqual --------------------------------------------------------------------- -- OPTIONS ON TOP -- | Internal method for adding forwarding actions fileForward :: (FilePath -> Maybe (Action (Maybe FileA))) -> Rules () fileForward act = addUserRule $ FileRule $ fmap ModeForward . act -- | Add a dependency on the file arguments, ensuring they are built before continuing. -- The file arguments may be built in parallel, in any order. This function is particularly -- necessary when calling 'Development.Shake.cmd' or 'Development.Shake.command'. As an example: -- -- @ -- \"\/\/*.rot13\" '%>' \\out -> do -- let src = 'Development.Shake.FilePath.dropExtension' out -- 'need' [src] -- 'Development.Shake.cmd' \"rot13\" [src] \"-o\" [out] -- @ -- -- Usually @need [foo,bar]@ is preferable to @need [foo] >> need [bar]@ as the former allows greater -- parallelism, while the latter requires @foo@ to finish building before starting to build @bar@. -- -- This function should not be called with wildcards (e.g. @*.txt@ - use 'getDirectoryFiles' to expand them), -- environment variables (e.g. @$HOME@ - use 'getEnv' to expand them) or directories (directories cannot be -- tracked directly - track files within the directory instead). 
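The rot13 example in the 'need' documentation above can be completed into a full, runnable build system. This is a sketch: the @rot13@ executable and the input file name are assumptions for illustration, not part of this module.

```haskell
import Development.Shake
import Development.Shake.FilePath

main :: IO ()
main = shakeArgs shakeOptions $ do
    -- hypothetical target; "greeting.txt" must already exist as a source file
    want ["greeting.txt.rot13"]
    "//*.rot13" %> \out -> do
        let src = dropExtension out
        need [src]                    -- declare the dependency before using it
        cmd "rot13" [src] "-o" [out]  -- assumes a rot13 executable on $PATH
```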
need :: [FilePath] -> Action () need = void . apply_ fileNameFromString -- | Like 'need' but returns a list of the dependencies that were rebuilt this build. -- -- The following example writes a list of changed dependencies to a file as its action. -- -- @ -- \"target\" '%>' \\out -> do -- let sourceList = [\"source1\", \"source2\"] -- rebuildList <- 'needHasChanged' sourceList -- 'Development.Shake.writeFileLines' out rebuildList -- @ -- -- This function can be used to alter the action depending on which dependency needed -- to be rebuilt. -- -- Note that a rule can be run even if no dependency has changed, for example -- because of 'shakeRebuild' or because the target has changed or been deleted. -- To detect the latter case you may wish to use 'resultHasChanged'. needHasChanged :: [FilePath] -> Action [FilePath] needHasChanged paths = do res <- apply_ fileNameFromString paths return [a | (a,b) <- zip paths res, hasChanged b] needBS :: [BS.ByteString] -> Action () needBS = void . apply_ fileNameFromByteString -- | Like 'need', but if 'shakeLint' is set, check that the file does not rebuild. -- Used for adding dependencies on files that have already been used in this rule. needed :: [FilePath] -> Action () needed xs = do opts <- getShakeOptions if isNothing $ shakeLint opts then need xs else neededCheck $ map fileNameFromString xs neededBS :: [BS.ByteString] -> Action () neededBS xs = do opts <- getShakeOptions if isNothing $ shakeLint opts then needBS xs else neededCheck $ map fileNameFromByteString xs neededCheck :: [FileName] -> Action () neededCheck xs = do opts <- getShakeOptions pre <- liftIO $ mapM (fileStoredValue opts . 
FileQ) xs post <- apply_ id xs let bad = [ (x, if isJust a then "File change" else "File created") | (x, a, FileR (Just b) _) <- zip3 xs pre post, maybe NotEqual (\a -> fileEqualValue opts a b) a == NotEqual] case bad of [] -> return () (file,msg):_ -> liftIO $ errorStructured "Lint checking error - 'needed' file required rebuilding" [("File", Just $ fileNameToString file) ,("Error",Just msg)] "" -- | Track that a file was read by the action preceding it. If 'shakeLint' is activated -- then these files must be dependencies of this rule. Calls to 'trackRead' are -- automatically inserted in 'LintFSATrace' mode. trackRead :: [FilePath] -> Action () trackRead = mapM_ (trackUse . FileQ . fileNameFromString) -- | Track that a file was written by the action preceding it. If 'shakeLint' is activated -- then these files must either be the target of this rule, or never referred to by the build system. -- Calls to 'trackWrite' are automatically inserted in 'LintFSATrace' mode. trackWrite :: [FilePath] -> Action () trackWrite = mapM_ (trackChange . FileQ . fileNameFromString) -- | Allow accessing a file in this rule, ignoring any 'trackRead' \/ 'trackWrite' calls matching -- the pattern. trackAllow :: [FilePattern] -> Action () trackAllow ps = do opts <- getShakeOptions when (isJust $ shakeLint opts) $ S.trackAllow $ \(FileQ x) -> any (?== fileNameToString x) ps -- | Require that the argument files are built by the rules, used to specify the target. -- -- @ -- main = 'Development.Shake.shake' 'shakeOptions' $ do -- 'want' [\"Main.exe\"] -- ... -- @ -- -- This program will build @Main.exe@, given sufficient rules. All arguments to all 'want' calls -- may be built in parallel, in any order. -- -- This function is defined in terms of 'action' and 'need', use 'action' if you need more complex -- targets than 'want' allows. 
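The 'want' documentation above, combined with 'phony' (defined later in this module) and 'Development.Shake.removeFilesAfter', gives the usual skeleton of a build system with a clean target. A sketch, with illustrative file names:

```haskell
import Development.Shake

main :: IO ()
main = shakeArgs shakeOptions $ do
    want ["Main.exe"]

    -- a user-invokable recipe: run with "shake clean" (or your build script)
    phony "clean" $ do
        putNormal "Cleaning files in _build"
        removeFilesAfter "_build" ["//*"]

    "Main.exe" %> \out ->
        cmd "ghc" "--make" "Main.hs" "-o" [out]
```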
want :: [FilePath] -> Rules () want [] = return () want xs = action $ need xs root :: String -> (FilePath -> Bool) -> (FilePath -> Action ()) -> Rules () root help test act = addUserRule $ FileRule $ \x -> if not $ test x then Nothing else Just $ ModeDirect $ do liftIO $ createDirectoryRecursive $ takeDirectory x act x -- | Declare a Make-style phony action. A phony target does not name -- a file (despite living in the same namespace as file rules); -- rather, it names some action to be executed when explicitly -- requested. You can demand 'phony' rules using 'want'. (And 'need', -- although that's not recommended.) -- -- Phony actions are intended to define recipes that can be executed -- by the user. If you 'need' a phony action in a rule then every -- execution where that rule is required will rerun both the rule and -- the phony action. However, note that phony actions are never -- executed more than once in a single build run. -- -- In make, the @.PHONY@ attribute on non-file-producing rules has a -- similar effect. However, while in make it is acceptable to omit -- the @.PHONY@ attribute as long as you don't create the file in -- question, a Shake rule which behaves this way will fail lint. -- Use a phony rule! For file-producing rules which should be -- rerun every execution of Shake, see 'Development.Shake.alwaysRerun'. phony :: String -> Action () -> Rules () phony (toStandard -> name) act = phonys $ \s -> if s == name then Just act else Nothing -- | A predicate version of 'phony', return 'Just' with the 'Action' for the matching rules. phonys :: (String -> Maybe (Action ())) -> Rules () phonys act = addUserRule $ FileRule $ fmap ModePhony . act -- | Infix operator alias for 'phony', for sake of consistency with normal -- rules. (~>) :: String -> Action () -> Rules () (~>) = phony -- | Define a rule to build files. If the first argument returns 'True' for a given file, -- the second argument will be used to build it. 
Usually '%>' is sufficient, but '?>' gives -- additional power. For any file used by the build system, only one rule should return 'True'. -- This function will create the directory for the result file, if necessary. -- -- @ -- (all isUpper . 'Development.Shake.FilePath.takeBaseName') '?>' \\out -> do -- let src = 'Development.Shake.FilePath.replaceBaseName' out $ map toLower $ takeBaseName out -- 'Development.Shake.writeFile'' out . map toUpper =<< 'Development.Shake.readFile'' src -- @ -- -- If the 'Action' completes successfully the file is considered up-to-date, even if the file -- has not changed. (?>) :: (FilePath -> Bool) -> (FilePath -> Action ()) -> Rules () (?>) test act = priority 0.5 $ root "with ?>" test act -- | Define a set of patterns, and if any of them match, run the associated rule. Defined in terms of '%>'. -- Think of it as the OR (@||@) equivalent of '%>'. (|%>) :: [FilePattern] -> (FilePath -> Action ()) -> Rules () (|%>) pats act = do let (simp,other) = partition simple pats case simp of [] -> return () [p] -> let pp = toStandard p in root "with |%>" (\x -> toStandard x == pp) act ps -> let ps = Set.fromList $ map toStandard pats in root "with |%>" (flip Set.member ps . toStandard) act unless (null other) $ let ps = map (?==) other in priority 0.5 $ root "with |%>" (\x -> any ($ x) ps) act -- | Define a rule that matches a 'FilePattern', see '?==' for the pattern rules. -- Patterns with no wildcards have higher priority than those with wildcards, and no file -- required by the system may be matched by more than one pattern at the same priority -- (see 'priority' and 'alternatives' to modify this behaviour). -- This function will create the directory for the result file, if necessary. 
-- -- @ -- \"*.asm.o\" '%>' \\out -> do -- let src = 'Development.Shake.FilePath.dropExtension' out -- 'need' [src] -- 'Development.Shake.cmd' \"as\" [src] \"-o\" [out] -- @ -- -- To define a build system for multiple compiled languages, we recommend using @.asm.o@, -- @.cpp.o@, @.hs.o@, to indicate which language produces an object file. -- I.e., the file @foo.cpp@ produces object file @foo.cpp.o@. -- -- Note that matching is case-sensitive, even on Windows. -- -- If the 'Action' completes successfully the file is considered up-to-date, even if the file -- has not changed. (%>) :: FilePattern -> (FilePath -> Action ()) -> Rules () (%>) test act = (if simple test then id else priority 0.5) $ root (show test) (test ?==) act shake-0.16.4/src/Development/Shake/Internal/Rules/Directory.hs0000644000000000000000000003353713261223301022341 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, ScopedTypeVariables, DeriveDataTypeable #-} {-# LANGUAGE TypeFamilies, ConstraintKinds #-} -- | Both System.Directory and System.Environment wrappers module Development.Shake.Internal.Rules.Directory( doesFileExist, doesDirectoryExist, getDirectoryContents, getDirectoryFiles, getDirectoryDirs, getEnv, getEnvWithDefault, removeFiles, removeFilesAfter, getDirectoryFilesIO, defaultRuleDirectory ) where import Control.Applicative import Control.Monad.Extra import Control.Monad.IO.Class import Data.Maybe import Data.Binary import Data.List import Data.Tuple.Extra import qualified Data.HashSet as Set import qualified System.Directory as IO import qualified System.Environment.Extra as IO import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Run import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Value import Development.Shake.Classes import Development.Shake.FilePath import Development.Shake.Internal.FilePattern import General.Extra import General.Binary import Prelude 
--------------------------------------------------------------------- -- KEY/VALUE TYPES type instance RuleResult DoesFileExistQ = DoesFileExistA newtype DoesFileExistQ = DoesFileExistQ FilePath deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show DoesFileExistQ where show (DoesFileExistQ a) = "doesFileExist " ++ wrapQuote a newtype DoesFileExistA = DoesFileExistA {fromDoesFileExistA :: Bool} deriving (Typeable,Eq,BinaryEx,NFData) instance Show DoesFileExistA where show (DoesFileExistA a) = show a type instance RuleResult DoesDirectoryExistQ = DoesDirectoryExistA newtype DoesDirectoryExistQ = DoesDirectoryExistQ FilePath deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show DoesDirectoryExistQ where show (DoesDirectoryExistQ a) = "doesDirectoryExist " ++ wrapQuote a newtype DoesDirectoryExistA = DoesDirectoryExistA {fromDoesDirectoryExistA :: Bool} deriving (Typeable,Eq,BinaryEx,NFData) instance Show DoesDirectoryExistA where show (DoesDirectoryExistA a) = show a type instance RuleResult GetEnvQ = GetEnvA newtype GetEnvQ = GetEnvQ String deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show GetEnvQ where show (GetEnvQ a) = "getEnv " ++ wrapQuote a newtype GetEnvA = GetEnvA {fromGetEnvA :: Maybe String} deriving (Typeable,Eq,Hashable,BinaryEx,NFData) instance Show GetEnvA where show (GetEnvA a) = maybe "" wrapQuote a type instance RuleResult GetDirectoryContentsQ = GetDirectoryA type instance RuleResult GetDirectoryFilesQ = GetDirectoryA type instance RuleResult GetDirectoryDirsQ = GetDirectoryA newtype GetDirectoryContentsQ = GetDirectoryContentsQ FilePath deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show GetDirectoryContentsQ where show (GetDirectoryContentsQ dir) = "getDirectoryContents " ++ wrapQuote dir newtype GetDirectoryFilesQ = GetDirectoryFilesQ (FilePath, [FilePattern]) deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show GetDirectoryFilesQ where show (GetDirectoryFilesQ 
(dir, pat)) = "getDirectoryFiles " ++ wrapQuote dir ++ " [" ++ unwords (map wrapQuote pat) ++ "]" newtype GetDirectoryDirsQ = GetDirectoryDirsQ FilePath deriving (Typeable,Eq,Hashable,Binary,BinaryEx,NFData) instance Show GetDirectoryDirsQ where show (GetDirectoryDirsQ dir) = "getDirectoryDirs " ++ wrapQuote dir newtype GetDirectoryA = GetDirectoryA {fromGetDirectoryA :: [FilePath]} deriving (Typeable,Eq,Hashable,BinaryEx,NFData) instance Show GetDirectoryA where show (GetDirectoryA xs) = unwords $ map wrapQuote xs --------------------------------------------------------------------- -- RULE DEFINITIONS queryRule :: (RuleResult key ~ value ,BinaryEx witness, Eq witness ,BinaryEx key, ShakeValue key ,Typeable value, NFData value, Show value, Eq value) => (value -> witness) -> (key -> IO value) -> Rules () queryRule witness query = addBuiltinRuleEx (\k old -> do new <- query k return $ if old == new then Nothing else Just $ show new) (\k old _ -> liftIO $ do new <- query k let wnew = witness new return $ case old of Just old | wnew == getEx old -> RunResult ChangedNothing old new _ -> RunResult ChangedRecomputeDiff (runBuilder $ putEx wnew) new) defaultRuleDirectory :: Rules () defaultRuleDirectory = do -- for things we are always going to rerun, and which might take up a lot of memory to store, -- we only store their hash, so we can detect a change, but not know what the change was queryRule id (\(DoesFileExistQ x) -> DoesFileExistA <$> IO.doesFileExist x) queryRule id (\(DoesDirectoryExistQ x) -> DoesDirectoryExistA <$> IO.doesDirectoryExist x) queryRule hash (\(GetEnvQ x) -> GetEnvA <$> IO.lookupEnv x) queryRule hash (\(GetDirectoryContentsQ x) -> GetDirectoryA <$> getDirectoryContentsIO x) queryRule hash (\(GetDirectoryFilesQ (a,b)) -> GetDirectoryA <$> getDirectoryFilesIO a b) queryRule hash (\(GetDirectoryDirsQ x) -> GetDirectoryA <$> getDirectoryDirsIO x) --------------------------------------------------------------------- -- RULE ENTRY POINTS -- | Returns 
'True' if the file exists. The existence of the file is tracked as a -- dependency, and if the file is created or deleted the rule will rerun in subsequent builds. -- -- You should not call 'doesFileExist' on files which can be created by the build system. doesFileExist :: FilePath -> Action Bool doesFileExist = fmap fromDoesFileExistA . apply1 . DoesFileExistQ . toStandard -- | Returns 'True' if the directory exists. The existence of the directory is tracked as a -- dependency, and if the directory is created or deleted the rule will rerun in subsequent builds. -- -- You should not call 'doesDirectoryExist' on directories which can be created by the build system. doesDirectoryExist :: FilePath -> Action Bool doesDirectoryExist = fmap fromDoesDirectoryExistA . apply1 . DoesDirectoryExistQ . toStandard -- | Return 'Just' the value of the environment variable, or 'Nothing' -- if the variable is not set. The environment variable is tracked as a -- dependency, and if it changes the rule will rerun in subsequent builds. -- This function is a tracked version of 'getEnv' / 'lookupEnv' from the base library. -- -- @ -- flags <- getEnv \"CFLAGS\" -- 'cmd' \"gcc -c\" [out] (maybe [] words flags) -- @ getEnv :: String -> Action (Maybe String) getEnv = fmap fromGetEnvA . apply1 . GetEnvQ -- | @'getEnvWithDefault' def var@ returns the value of the environment variable @var@, or the -- default value @def@ if it is not set. Similar to 'getEnv'. -- -- @ -- flags <- getEnvWithDefault \"-Wall\" \"CFLAGS\" -- 'cmd' \"gcc -c\" [out] flags -- @ getEnvWithDefault :: String -> String -> Action String getEnvWithDefault def var = fromMaybe def <$> getEnv var -- | Get the contents of a directory. The result will be sorted, and will not contain -- the entries @.@ or @..@ (unlike the standard Haskell version). -- The resulting paths will be relative to the first argument. -- The result itself is tracked as a dependency, but the files in the result are not. 
-- If the list of files changes in subsequent builds any rule calling it will rerun. -- -- It is usually simpler to call either 'getDirectoryFiles' or 'getDirectoryDirs'. getDirectoryContents :: FilePath -> Action [FilePath] getDirectoryContents = fmap fromGetDirectoryA . apply1 . GetDirectoryContentsQ -- | Get the files anywhere under a directory that match any of a set of patterns. -- For the interpretation of the patterns see '?=='. All results will be -- relative to the directory argument. -- The result itself is tracked as a dependency, but the files in the result are not. -- If the list of files changes in subsequent builds any rule calling it will rerun. -- Some examples: -- -- > getDirectoryFiles "Config" ["//*.xml"] -- > -- All .xml files anywhere under the Config directory -- > -- If Config/foo/bar.xml exists it will return ["foo/bar.xml"] -- > getDirectoryFiles "Modules" ["*.hs","*.lhs"] -- > -- All .hs or .lhs in the Modules directory -- > -- If Modules/foo.hs and Modules/foo.lhs exist, it will return ["foo.hs","foo.lhs"] -- -- If you require a qualified file name it is often easier to use @\"\"@ as the 'FilePath' argument, -- for example the following two expressions are equivalent: -- -- > fmap (map ("Config" </>)) (getDirectoryFiles "Config" ["//*.xml"]) -- > getDirectoryFiles "" ["Config//*.xml"] -- -- If the first argument directory does not exist it will raise an error. -- If @foo@ does not exist, then the first of these will error, but the second will not. -- -- > getDirectoryFiles "foo" ["//*"] -- error -- > getDirectoryFiles "" ["foo//*"] -- returns [] -- -- This function is tracked and serves as a dependency. If a rule calls -- @getDirectoryFiles \"\" [\"*.c\"]@ and someone adds @foo.c@ to the -- directory, that rule will rebuild. If someone changes one of the @.c@ files, -- but the /list/ of @.c@ files doesn't change, then it will not rebuild. -- As a consequence of being tracked, if the contents change during the build -- (e.g. 
you are generating @.c@ files in this directory) then the build will not reach -- a stable point, which is an error - detected by running with @--lint@. -- You should normally only call this function to list source files. -- -- For an untracked variant see 'getDirectoryFilesIO'. getDirectoryFiles :: FilePath -> [FilePattern] -> Action [FilePath] getDirectoryFiles dir pat = fmap fromGetDirectoryA $ apply1 $ GetDirectoryFilesQ (dir,pat) -- | Get the directories in a directory, not including @.@ or @..@. -- All directories are relative to the argument directory. -- The result itself is tracked as a dependency, but the directories in the result are not. -- If the list of directories changes in subsequent builds any rule calling it will rerun. -- The rules about creating entries described in 'getDirectoryFiles' also apply here. -- -- > getDirectoryDirs "/Users" -- > -- Return all directories in the /Users directory -- > -- e.g. ["Emily","Henry","Neil"] getDirectoryDirs :: FilePath -> Action [FilePath] getDirectoryDirs = fmap fromGetDirectoryA . apply1 . GetDirectoryDirsQ --------------------------------------------------------------------- -- IO ROUTINES getDirectoryContentsIO :: FilePath -> IO [FilePath] -- getDirectoryContents "" is equivalent to getDirectoryContents "." on Windows, -- but raises an error on Linux. We smooth out the difference. getDirectoryContentsIO dir = fmap (sort . filter (not . all (== '.'))) $ IO.getDirectoryContents $ if dir == "" then "." else dir getDirectoryDirsIO :: FilePath -> IO [FilePath] getDirectoryDirsIO dir = filterM f =<< getDirectoryContentsIO dir where f x = IO.doesDirectoryExist $ dir </> x -- | A version of 'getDirectoryFiles' that is in IO, and thus untracked.
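-- For example, it can be used to compute a file list before the build starts,
-- without introducing a dependency (the pattern below is illustrative):
--
-- > srcs <- getDirectoryFilesIO "" ["src//*.hs"]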
getDirectoryFilesIO :: FilePath -> [FilePattern] -> IO [FilePath] -- Known infelicity: on Windows, if you search for "foo", but have the file "FOO", -- it will match if on its own, or not if it is paired with "*", since that forces -- a full directory scan, and then it uses Haskell equality (case sensitive) getDirectoryFilesIO root pat = f "" $ snd $ walk pat where -- Even after we know they are there because we called contents, we still have to check they are directories/files -- as required f dir (Walk op) = f dir . WalkTo . op =<< getDirectoryContentsIO (root </> dir) f dir (WalkTo (files, dirs)) = do files <- filterM (IO.doesFileExist . (root </>)) $ map (dir </>) files dirs <- concatMapM (uncurry f) =<< filterM (IO.doesDirectoryExist . (root </>) . fst) (map (first (dir </>)) dirs) return $ files ++ dirs --------------------------------------------------------------------- -- REMOVE UTILITIES -- | Remove all files and directories that match any of the patterns within a directory. -- Some examples: -- -- @ -- 'removeFiles' \"output\" [\"\/\/*\"] -- delete everything inside \'output\' -- 'removeFiles' \"output\" [\"\/\/\"] -- delete \'output\' itself -- 'removeFiles' \".\" [\"\/\/*.hi\",\"\/\/*.o\"] -- delete all \'.hi\' and \'.o\' files -- @ -- -- If the argument directory is missing no error is raised. -- This function will follow symlinks, so should be used with care. -- -- This function is often useful when writing a @clean@ action for your build system, -- often as a 'phony' rule. removeFiles :: FilePath -> [FilePattern] -> IO () removeFiles dir pat = whenM (IO.doesDirectoryExist dir) $ do let (b,w) = walk pat if b then removeDir dir else f dir w where f dir (Walk op) = f dir . WalkTo . op =<< getDirectoryContentsIO dir f dir (WalkTo (files, dirs)) = do forM_ files $ \fil -> tryIO $ removeItem $ dir </> fil let done = Set.fromList files forM_ (filter (not . flip Set.member done .
fst) dirs) $ \(d,w) -> do let dir2 = dir </> d whenM (IO.doesDirectoryExist dir2) $ f dir2 w removeItem :: FilePath -> IO () removeItem x = IO.removeFile x `catchIO` \_ -> removeDir x -- In newer GHCs removeDirectoryRecursive is probably better, but it doesn't follow -- symlinks, so it has different behaviour removeDir :: FilePath -> IO () removeDir x = do mapM_ (removeItem . (x </>)) =<< getDirectoryContentsIO x IO.removeDirectory x -- | Remove files, like 'removeFiles', but executed after the build completes successfully. -- Useful for implementing @clean@ actions that delete files Shake may have open for building. removeFilesAfter :: FilePath -> [FilePattern] -> Action () removeFilesAfter a b = do putLoud $ "Will remove " ++ unwords b ++ " from " ++ a runAfter $ removeFiles a b shake-0.16.4/src/Development/Shake/Internal/Core/Types.hs {-# LANGUAGE CPP #-} {-# LANGUAGE GeneralizedNewtypeDeriving, ScopedTypeVariables, DeriveDataTypeable #-} {-# LANGUAGE ExistentialQuantification, DeriveFunctor, RecordWildCards #-} module Development.Shake.Internal.Core.Types( BuiltinRun, BuiltinLint, RunResult(..), RunChanged(..), UserRule(..), UserRule_(..), BuiltinRule(..), Global(..), Local(..), Action(..), newLocal, localClearMutable, localMergeMutable ) where import Control.DeepSeq import Control.Monad.IO.Class import Control.Applicative import Data.Typeable import General.Binary import qualified Data.HashMap.Strict as Map import Data.IORef import qualified Data.ByteString as BS import System.Time.Extra import Development.Shake.Internal.Core.Pool import Development.Shake.Internal.Core.Database import Development.Shake.Internal.Core.Monad import Development.Shake.Internal.Value import Development.Shake.Internal.Options import General.Cleanup import Prelude #if __GLASGOW_HASKELL__ >= 800 import
Control.Monad.Fail #endif --------------------------------------------------------------------- -- UNDERLYING DATA TYPE -- | The 'Action' monad, use 'liftIO' to raise 'IO' actions into it, and 'Development.Shake.need' to execute files. -- Action values are used by 'addUserRule' and 'action'. The 'Action' monad tracks the dependencies of a rule. -- To raise an exception call 'error', 'fail' or @'liftIO' . 'throwIO'@. newtype Action a = Action {fromAction :: RAW Global Local a} deriving (Functor, Applicative, Monad, MonadIO, Typeable #if __GLASGOW_HASKELL__ >= 800 ,MonadFail #endif ) -- | How has a rule changed? data RunChanged = ChangedNothing -- ^ Nothing has changed. | ChangedStore -- ^ The persisted value has changed, but in a way that should be considered identical. | ChangedRecomputeSame -- ^ I recomputed the value and it was the same. | ChangedRecomputeDiff -- ^ I recomputed the value and it was different. deriving (Eq,Show) instance NFData RunChanged where rnf x = x `seq` () -- | The result of 'BuiltinRun'. data RunResult value = RunResult {runChanged :: RunChanged -- ^ What has changed from the previous time. ,runStore :: BS.ByteString -- ^ Return the new value to store. Often a serialised version of 'runValue'. ,runValue :: value -- ^ Return the produced value. } deriving Functor instance NFData value => NFData (RunResult value) where rnf (RunResult x1 x2 x3) = rnf x1 `seq` x2 `seq` rnf x3 -- | Define a rule between @key@ and @value@. A rule for a class of artifacts (e.g. /files/) provides: -- -- * How to identify individual artifacts, given by the @key@ type, e.g. with file names. -- -- * How to describe the state of an artifact, given by the @value@ type, e.g. the file modification time. -- -- * How to persist the state of an artifact, using the 'ByteString' values, e.g. serialised @value@.
-- -- The arguments comprise the @key@, the value of the previous serialisation or 'Nothing' if the rule -- has not been run previously, and 'True' to indicate the dependencies have changed or 'False' that -- they have not. type BuiltinRun key value = key -> Maybe BS.ByteString -> Bool -> Action (RunResult value) -- | The action performed by @--lint@ for a given @key@/@value@ pair. -- At the end of the build the lint action will be called for each @key@ that was built this run, -- passing the @value@ it produced. Return 'Nothing' to indicate the value has not changed and -- is acceptable, or 'Just' an error message to indicate failure. -- -- For builtin rules where the value is expected to change use 'Development.Shake.Rules.noLint'. type BuiltinLint key value = key -> value -> IO (Maybe String) data BuiltinRule = BuiltinRule {builtinRun :: BuiltinRun Key Value ,builtinLint :: BuiltinLint Key Value ,builtinResult :: TypeRep ,builtinKey :: BinaryOp Key } data UserRule_ = forall a . Typeable a => UserRule_ (UserRule a) -- | A 'UserRule' data type, representing user-defined rules associated with a particular type. -- As an example 'Development.Shake.?>' and 'Development.Shake.%>' will add entries to the 'UserRule' data type. data UserRule a -- > priority p1 (priority p2 x) == priority p1 x -- > priority p (x `ordered` y) = priority p x `ordered` priority p y -- > priority p (x `unordered` y) = priority p x `unordered` priority p y -- > ordered is associative -- > unordered is associative and commutative -- > alternative does not obey priorities, until picking the best one = UserRule a -- ^ Added to the state with @'addUserRule' :: Typeable a => a -> 'Rules' ()@. | Unordered [UserRule a] -- ^ Rules combined with the 'Monad' \/ 'Monoid'. | Priority Double (UserRule a) -- ^ Rules defined under 'priority'. | Alternative (UserRule a) -- ^ Rule defined under 'alternatives', matched in order. 
deriving (Eq,Show,Functor,Typeable) -- global constants of Action data Global = Global {globalDatabase :: Database -- ^ Database, contains knowledge of the state of each key ,globalPool :: Pool -- ^ Pool, for queuing new elements ,globalCleanup :: Cleanup -- ^ Cleanup operations ,globalTimestamp :: IO Seconds -- ^ Clock saying how many seconds through the build ,globalRules :: Map.HashMap TypeRep BuiltinRule -- ^ Rules for this build ,globalOutput :: Verbosity -> String -> IO () -- ^ Output function ,globalOptions :: ShakeOptions -- ^ Shake options ,globalDiagnostic :: IO String -> IO () -- ^ Debugging function ,globalCurDir :: FilePath -- ^ getCurrentDirectory when we started ,globalAfter :: IORef [IO ()] -- ^ Operations to run on success, e.g. removeFilesAfter ,globalTrackAbsent :: IORef [(Key, Key)] -- ^ Tracked things, in rule fst, snd must be absent ,globalProgress :: IO Progress -- ^ Request current progress state ,globalUserRules :: Map.HashMap TypeRep UserRule_ } -- local variables of Action data Local = Local -- constants {localStack :: Stack -- ^ The stack that ran to get here. 
-- stack scoped local variables ,localVerbosity :: Verbosity -- ^ Verbosity, may be changed locally ,localBlockApply :: Maybe String -- ^ Reason to block apply, or Nothing to allow -- mutable local variables ,localDepends :: [Depends] -- ^ Dependencies, built up in reverse ,localDiscount :: !Seconds -- ^ Time spent building dependencies ,localTraces :: [Trace] -- ^ Traces, built in reverse ,localTrackAllows :: [Key -> Bool] -- ^ Things that are allowed to be used ,localTrackUsed :: [Key] -- ^ Things that have been used } newLocal :: Stack -> Verbosity -> Local newLocal stack verb = Local stack verb Nothing [] 0 [] [] [] -- Clear all the local mutable variables localClearMutable :: Local -> Local localClearMutable Local{..} = (newLocal localStack localVerbosity){localBlockApply=localBlockApply} -- Merge, works well assuming you clear the variables first localMergeMutable :: Local -> [Local] -> Local -- don't construct with RecordWildCards so any new fields raise an error localMergeMutable root xs = Local -- immutable/stack that need copying {localStack = localStack root ,localVerbosity = localVerbosity root ,localBlockApply = localBlockApply root -- mutable locals that need integrating -- note that a lot of the lists are stored in reverse, assume root happened first ,localDepends = concatMap localDepends xs ++ localDepends root ,localDiscount = localDiscount root + maximum (0:map localDiscount xs) ,localTraces = concatMap localTraces xs ++ localTraces root ,localTrackAllows = localTrackAllows root ++ concatMap localTrackAllows xs ,localTrackUsed = localTrackUsed root ++ concatMap localTrackUsed xs } shake-0.16.4/src/Development/Shake/Internal/Core/Storage.hs {-# LANGUAGE ScopedTypeVariables, RecordWildCards, FlexibleInstances #-} {-# LANGUAGE BangPatterns #-} {- This module stores the meta-data so it's very important it's always accurate We can't rely on getting any exceptions or termination at the
end, so we'd better write out a journal. We store a series of records, and if they contain twice as many records as needed, we compress. -} module Development.Shake.Internal.Core.Storage( withStorage ) where import General.Chunks import General.Binary import General.Intern import Development.Shake.Internal.Options import General.Timing import General.FileLock import qualified General.Ids as Ids import Control.Exception.Extra import Control.Monad.Extra import Data.Monoid import Data.Either.Extra import Data.Time import Data.Char import Data.Word import Development.Shake.Classes import Numeric import General.Extra import Data.List.Extra import Data.Maybe import System.FilePath import qualified Data.ByteString.UTF8 as UTF8 import qualified Data.HashMap.Strict as Map import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString as BS8 import Data.Functor import Prelude -- Increment every time the on-disk format/semantics change, -- @x@ is for the user's version number databaseVersion :: String -> String -- THINGS I WANT TO DO ON THE NEXT CHANGE -- * Change filepaths to store a 1 byte prefix saying 8bit ASCII or UTF8 -- * Duration and Time should be stored as number of 1/10000th seconds Int32 databaseVersion x = "SHAKE-DATABASE-13-" ++ s ++ "\r\n" where s = tail $ init $ show x -- call show, then take off the leading/trailing quotes -- ensures we do not get \r or \n in the user portion -- | Storage of heterogeneous things. In the particular case of Shake, -- k ~ TypeRep, v ~ (Key, Status{Value}). -- -- The storage starts with a witness table saying what can be contained. -- If any entries in the witness table don't have a current Witness then a fake -- error witness is manufactured. If the witness ever changes the entire DB is -- rewritten.
withStorage :: (Show k, Eq k, Hashable k, NFData k, Show v, NFData v) => ShakeOptions -- ^ Storage options -> (IO String -> IO ()) -- ^ Logging function -> Map.HashMap k (BinaryOp v) -- ^ Witnesses -> (Ids.Ids v -> (k -> Id -> v -> IO ()) -> IO a) -- ^ Execute -> IO a withStorage ShakeOptions{..} diagnostic witness act = withLockFileDiagnostic diagnostic (shakeFiles </> ".shake.lock") $ do let dbfile = shakeFiles </> ".shake.database" createDirectoryRecursive shakeFiles -- complete a partially failed compress whenM (restoreChunksBackup dbfile) $ do unexpected "Backup file exists, restoring over the previous file\n" diagnostic $ return "Backup file moved to original" addTiming "Database read" withChunks dbfile shakeFlush $ \h -> do let corrupt | not shakeStorageLog = resetChunksCorrupt Nothing h | otherwise = do let file = dbfile <.> "corrupt" resetChunksCorrupt (Just file) h unexpected $ "Backup of corrupted file stored at " ++ file ++ "\n" -- check the version information matches let ver = BS.pack $ databaseVersion shakeVersion oldVer <- readChunkMax h $ fromIntegral $ BS.length ver + 100000 let verEq = Right ver == oldVer when (not shakeVersionIgnore && not verEq && oldVer /= Left BS.empty) $ do let limit x = let (a,b) = splitAt 200 x in a ++ (if null b then "" else "...") let disp = map (\x -> if isPrint x && isAscii x then x else '?') . takeWhile (`notElem` "\r\n") outputErr $ unlines ["Error when reading Shake database - invalid version stamp detected:" ," File: " ++ dbfile ," Expected: " ++ disp (BS.unpack ver) ," Found: " ++ disp (limit $ BS.unpack $ fromEither oldVer) ,"All rules will be rebuilt"] corrupt let (witnessNew, save) = putWitness witness evaluate save witnessOld <- readChunk h ids <- case witnessOld of Left _ -> do resetChunksCorrupt Nothing h return Nothing Right witnessOld -> handleBool (not .
isAsyncException) (\err -> do msg <- showException err outputErr $ unlines $ ("Error when reading Shake database " ++ dbfile) : map (" "++) (lines msg) ++ ["All files will be rebuilt"] corrupt return Nothing) $ do let load = getWitness witnessOld witness evaluate load ids <- Ids.empty let go !i = do v <- readChunk h case v of Left e -> do let slop = fromIntegral $ BS.length e when (slop > 0) $ unexpected $ "Last " ++ show slop ++ " bytes do not form a whole record\n" diagnostic $ return $ "Read " ++ show i ++ " chunks, plus " ++ show slop ++ " slop" return i Right bs -> do let (k,id,v) = load bs evaluate $ rnf k evaluate $ rnf v Ids.insert ids id (k,v) diagnostic $ do let raw x = "[len " ++ show (BS.length bs) ++ "] " ++ concat [['0' | length c == 1] ++ c | x <- BS8.unpack bs, let c = showHex x ""] let pretty (Left x) = "FAILURE: " ++ show x pretty (Right x) = x x2 <- try_ $ evaluate $ let s = show v in rnf s `seq` s return $ "Chunk " ++ show i ++ " " ++ raw bs ++ " " ++ show id ++ " = " ++ pretty x2 go $ i+1 countItems <- go 0 countDistinct <- Ids.sizeUpperBound ids diagnostic $ return $ "Found at most " ++ show countDistinct ++ " distinct entries out of " ++ show countItems when (countItems > countDistinct*2 || not verEq || witnessOld /= witnessNew) $ do addTiming "Database compression" resetChunksCompact h $ \out -> do out $ putEx ver out $ putEx witnessNew Ids.forWithKeyM_ ids $ \i (k,v) -> out $ save k i v Just <$> Ids.for ids snd ids <- case ids of Just ids -> return ids Nothing -> do writeChunk h $ putEx ver writeChunk h $ putEx witnessNew Ids.empty addTiming "With database" writeChunks h $ \out -> act ids $ \k i v -> out $ save k i v where unexpected x = when shakeStorageLog $ do t <- getCurrentTime appendFile (shakeFiles </> ".shake.storage.log") $ "\n[" ++ show t ++ "]: " ++ trimEnd x ++ "\n" outputErr x = do when (shakeVerbosity >= Quiet) $ shakeOutput Quiet x unexpected x keyName :: Show k => k -> BS.ByteString keyName = UTF8.fromString .
show getWitness :: Show k => BS.ByteString -> Map.HashMap k (BinaryOp v) -> (BS.ByteString -> (k, Id, v)) getWitness bs mp | length ws > limit || Map.size mp > limit = error "Number of distinct witness types exceeds limit" | otherwise = ind `seq` mp2 `seq` \bs -> let (k :: Word16,bs2) = binarySplit bs in case ind (fromIntegral k) of Nothing -> error $ "Witness type out of bounds, " ++ show k Just f -> f bs2 where limit = fromIntegral (maxBound :: Word16) ws :: [BS.ByteString] = getEx bs mp2 = Map.fromList [(keyName k, (k, v)) | (k,v) <- Map.toList mp] ind = fastAt [ case Map.lookup w mp2 of Nothing -> error $ "Witness type has disappeared, " ++ UTF8.toString w Just (k, BinaryOp{..}) -> \bs -> let (i, bs2) = binarySplit bs v = getOp bs2 in (k, i, v) | w <- ws] putWitness :: (Eq k, Hashable k, Show k) => Map.HashMap k (BinaryOp v) -> (BS.ByteString, k -> Id -> v -> Builder) putWitness mp = (runBuilder $ putEx (ws :: [BS.ByteString]), mp2 `seq` \k -> fromMaybe (error $ "Don't know how to save, " ++ show k) $ Map.lookup k mp2) where ws = sort $ map keyName $ Map.keys mp wsMp = Map.fromList $ zip ws [0 :: Word16 ..] mp2 = Map.mapWithKey (\k BinaryOp{..} -> let tag = putEx $ wsMp Map.! 
keyName k in \(Id w) v -> tag <> putEx w <> putOp v) mp withLockFileDiagnostic :: (IO String -> IO ()) -> FilePath -> IO a -> IO a withLockFileDiagnostic diagnostic file act = do diagnostic $ return $ "Before withLockFile on " ++ file res <- withLockFile file $ do diagnostic $ return "Inside withLockFile" act diagnostic $ return "After withLockFile" return res shake-0.16.4/src/Development/Shake/Internal/Core/Run.hs {-# LANGUAGE RecordWildCards, NamedFieldPuns, ScopedTypeVariables, PatternGuards #-} {-# LANGUAGE ConstraintKinds #-} {-# LANGUAGE TypeFamilies #-} module Development.Shake.Internal.Core.Run( run, Action, actionOnException, actionFinally, apply, apply1, traced, getDatabaseValue, getShakeOptions, getProgress, getVerbosity, putLoud, putNormal, putQuiet, withVerbosity, quietly, Resource, newResourceIO, withResource, newThrottleIO, newCacheIO, unsafeExtraThread, unsafeAllowApply, parallel, orderOnlyAction, batch, runAfter ) where import Control.Exception import Control.Applicative import Data.Tuple.Extra import Control.Concurrent.Extra import Control.Monad.Extra import Control.Monad.IO.Class import Data.Typeable.Extra import Data.Function import Data.Either.Extra import Data.List.Extra import qualified Data.HashMap.Strict as Map import Data.Dynamic import Data.Maybe import Data.IORef import System.Directory import System.IO.Extra import System.Time.Extra import Numeric.Extra import qualified Data.ByteString as BS import Development.Shake.Classes import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Action import Development.Shake.Internal.Core.Rules import Development.Shake.Internal.Core.Pool import Development.Shake.Internal.Core.Database import Development.Shake.Internal.Core.Monad import Development.Shake.Internal.Resource import Development.Shake.Internal.Value import Development.Shake.Internal.Profile import Development.Shake.Internal.Options import
Development.Shake.Internal.Errors import General.Timing import General.Extra import General.Concurrent import General.Cleanup import Prelude --------------------------------------------------------------------- -- MAKE -- | Internal main function (not exported publicly) run :: ShakeOptions -> Rules () -> IO () run opts@ShakeOptions{..} rs = (if shakeLineBuffering then withLineBuffering else id) $ do opts@ShakeOptions{..} <- if shakeThreads /= 0 then return opts else do p <- getProcessorCount; return opts{shakeThreads=p} start <- offsetTime (actions, ruleinfo, userRules) <- runRules opts rs outputLocked <- do lock <- newLock return $ \v msg -> withLock lock $ shakeOutput v msg let diagnostic | shakeVerbosity < Diagnostic = const $ return () | otherwise = \act -> do v <- act; outputLocked Diagnostic $ "% " ++ v let output v = outputLocked v . shakeAbbreviationsApply opts diagnostic $ return "Starting run" except <- newIORef (Nothing :: Maybe (String, ShakeException)) let raiseError err | not shakeStaunch = throwIO err | otherwise = do let named = shakeAbbreviationsApply opts . 
shakeExceptionTarget atomicModifyIORef except $ \v -> (Just $ fromMaybe (named err, err) v, ()) -- no need to print exceptions here, they get printed when they are wrapped curdir <- getCurrentDirectory diagnostic $ return "Starting run 2" checkShakeExtra shakeExtra after <- newIORef [] absent <- newIORef [] withCleanup $ \cleanup -> do addCleanup_ cleanup $ do when (shakeTimings && shakeVerbosity >= Normal) printTimings resetTimings -- so we don't leak memory withNumCapabilities shakeThreads $ do diagnostic $ return "Starting run 3" withDatabase opts diagnostic (Map.map builtinKey ruleinfo) $ \database -> do wait <- newBarrier let getProgress = do failure <- fmap fst <$> readIORef except stats <- progress database return stats{isFailure=failure} tid <- flip forkFinally (const $ signalBarrier wait ()) $ shakeProgress getProgress addCleanup_ cleanup $ do killThread tid void $ timeout 1 $ waitBarrier wait addTiming "Running rules" runPool (shakeThreads == 1) shakeThreads $ \pool -> do let s0 = Global database pool cleanup start ruleinfo output opts diagnostic curdir after absent getProgress userRules let s1 = newLocal emptyStack shakeVerbosity forM_ actions $ \act -> addPoolStart pool $ runAction s0 s1 act $ \x -> case x of Left e -> raiseError =<< shakeException s0 ["Top-level action/want"] e Right x -> return x maybe (return ()) (throwIO . 
snd) =<< readIORef except assertFinishedDatabase database let putWhen lvl msg = when (shakeVerbosity >= lvl) $ output lvl msg when (null actions) $ putWhen Normal "Warning: No want/action statements, nothing to do" when (isJust shakeLint) $ do addTiming "Lint checking" lintCurrentDirectory curdir "After completion" absent <- readIORef absent checkValid database (runLint ruleinfo) absent putWhen Loud "Lint checking succeeded" when (shakeReport /= []) $ do addTiming "Profile report" report <- toReport database forM_ shakeReport $ \file -> do putWhen Normal $ "Writing report to " ++ file writeProfile file report when (shakeLiveFiles /= []) $ do addTiming "Listing live" live <- listLive database let specialIsFileKey t = show (fst $ splitTyConApp t) == "FileQ" let liveFiles = [show k | k <- live, specialIsFileKey $ typeKey k] forM_ shakeLiveFiles $ \file -> do putWhen Normal $ "Writing live list to " ++ file (if file == "-" then putStr else writeFile file) $ unlines liveFiles after <- readIORef after unless (null after) $ do addTiming "Running runAfter" sequence_ $ reverse after checkShakeExtra :: Map.HashMap TypeRep Dynamic -> IO () checkShakeExtra mp = do let bad = [(k,t) | (k,v) <- Map.toList mp, let t = dynTypeRep v, t /= k] case bad of (k,t):xs -> errorStructured "Invalid Map in shakeExtra" [("Key",Just $ show k),("Value type",Just $ show t)] (if null xs then "" else "Plus " ++ show (length xs) ++ " other keys") _ -> return () lintCurrentDirectory :: FilePath -> String -> IO () lintCurrentDirectory old msg = do now <- getCurrentDirectory when (old /= now) $ errorStructured "Lint checking error - current directory has changed" [("When", Just msg) ,("Wanted",Just old) ,("Got",Just now)] "" withLineBuffering :: IO a -> IO a withLineBuffering act = do -- instead of withBuffering avoid two finally handlers and stack depth out <- hGetBuffering stdout err <- hGetBuffering stderr if out == LineBuffering && err == LineBuffering then act else do hSetBuffering stdout 
LineBuffering hSetBuffering stderr LineBuffering act `finally` do hSetBuffering stdout out hSetBuffering stderr err getDatabaseValue :: (RuleResult key ~ value, ShakeValue key, Typeable value) => key -> Action (Maybe (Either BS.ByteString value)) getDatabaseValue k = do global@Global{..} <- Action getRO liftIO $ fmap (fmap $ fmap fromValue) $ lookupStatus globalDatabase $ newKey k -- | Execute a rule, returning the associated values. If possible, the rules will be run in parallel. -- This function requires that appropriate rules have been added with 'addUserRule'. -- All @key@ values passed to 'apply' become dependencies of the 'Action'. apply :: (RuleResult key ~ value, ShakeValue key, Typeable value) => [key] -> Action [value] -- Don't short-circuit [] as we still want error messages apply (ks :: [key]) = withResultType $ \(p :: Maybe (Action [value])) -> do -- this is the only place a user can inject a key into our world, so check they aren't throwing -- in unevaluated bottoms liftIO $ mapM_ (evaluate . rnf) ks let tk = typeRep (Proxy :: Proxy key) tv = typeRep (Proxy :: Proxy value) Global{..} <- Action getRO Local{localBlockApply} <- Action getRW whenJust localBlockApply $ liftIO . 
errorNoApply tk (show <$> listToMaybe ks) case Map.lookup tk globalRules of Nothing -> liftIO $ errorNoRuleToBuildType tk (show <$> listToMaybe ks) (Just tv) Just BuiltinRule{builtinResult=tv2} | tv /= tv2 -> errorInternal $ "result type does not match, " ++ show tv ++ " vs " ++ show tv2 _ -> fmap (map fromValue) $ applyKeyValue $ map newKey ks applyKeyValue :: [Key] -> Action [Value] applyKeyValue [] = return [] applyKeyValue ks = do global@Global{..} <- Action getRO Local{localStack} <- Action getRW (dur, dep, vs) <- Action $ captureRAW $ build globalPool globalDatabase (BuildKey $ runKey global) localStack ks Action $ modifyRW $ \s -> s{localDiscount=localDiscount s + dur, localDepends=dep : localDepends s} return vs runKey :: Global -> Stack -> Step -> Key -> Maybe (Result BS.ByteString) -> Bool -> Capture (Either SomeException (Bool, BS.ByteString, Result Value)) runKey global@Global{globalOptions=ShakeOptions{..},..} stack step k r dirtyChildren continue = do let tk = typeKey k BuiltinRule{..} <- case Map.lookup tk globalRules of Nothing -> errorNoRuleToBuildType tk (Just $ show k) Nothing Just r -> return r let s = newLocal stack shakeVerbosity time <- offsetTime runAction global s (do res <- builtinRun k (fmap result r) dirtyChildren liftIO $ evaluate $ rnf res when (Just LintFSATrace == shakeLint) trackCheckUsed Action $ fmap ((,) res) getRW) $ \x -> case x of Left e -> do e <- if isNothing shakeLint then return e else handle return $ do lintCurrentDirectory globalCurDir $ "Running " ++ show k; return e continue . Left . 
toException =<< shakeException global (showStack stack) e Right (RunResult{..}, Local{..}) | runChanged == ChangedNothing || runChanged == ChangedStore, Just r <- r -> continue $ Right (runChanged == ChangedStore, runStore, r{result = runValue}) | otherwise -> do dur <- time let c | Just r <- r, runChanged == ChangedRecomputeSame = changed r | otherwise = step continue $ Right $ (,,) True runStore Result {result = runValue ,changed = c ,built = step ,depends = nubDepends $ reverse localDepends ,execution = doubleToFloat $ dur - localDiscount ,traces = reverse localTraces} runLint :: Map.HashMap TypeRep BuiltinRule -> Key -> Value -> IO (Maybe String) runLint mp k v = case Map.lookup (typeKey k) mp of Nothing -> return Nothing Just BuiltinRule{..} -> builtinLint k v -- | Turn a normal exception into a ShakeException, giving it a stack and printing it out if in staunch mode. -- If the exception is already a ShakeException (e.g. it's a child of ours who failed and we are rethrowing) -- then do nothing with it. shakeException :: Global -> [String] -> SomeException -> IO ShakeException shakeException Global{globalOptions=ShakeOptions{..},..} stk e@(SomeException inner) = case cast inner of Just e@ShakeException{} -> return e Nothing -> do e <- return $ ShakeException (last $ "Unknown call stack" : stk) stk e when (shakeStaunch && shakeVerbosity >= Quiet) $ globalOutput Quiet $ show e ++ "Continuing due to staunch mode" return e -- | Apply a single rule, equivalent to calling 'apply' with a singleton list. Where possible, -- use 'apply' to allow parallelism. apply1 :: (RuleResult key ~ value, ShakeValue key, Typeable value) => key -> Action value apply1 = fmap head . apply . return --------------------------------------------------------------------- -- RESOURCES -- | Run an action which uses part of a finite resource. For more details see 'Resource'. -- You cannot depend on a rule (e.g. 'need') while a resource is held. 
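-- For example, to ensure at most one link step runs at a time -- an
-- illustrative sketch (the resource name, command and @objs@ are hypothetical):
--
-- > lock <- newResourceIO "linker" 1
-- > -- then, inside a rule:
-- > withResource lock 1 $ cmd "ld -o" [out] objs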
withResource :: Resource -> Int -> Action a -> Action a withResource r i act = do Global{..} <- Action getRO liftIO $ globalDiagnostic $ return $ show r ++ " waiting to acquire " ++ show i offset <- liftIO offsetTime Action $ captureRAW $ \continue -> acquireResource r globalPool i $ continue $ Right () res <- Action $ tryRAW $ fromAction $ blockApply ("Within withResource using " ++ show r) $ do offset <- liftIO offset liftIO $ globalDiagnostic $ return $ show r ++ " acquired " ++ show i ++ " in " ++ showDuration offset Action $ modifyRW $ \s -> s{localDiscount = localDiscount s + offset} act liftIO $ releaseResource r globalPool i liftIO $ globalDiagnostic $ return $ show r ++ " released " ++ show i Action $ either throwRAW return res -- | A version of 'Development.Shake.newCache' that runs in IO, and can be called before calling 'Development.Shake.shake'. -- Most people should use 'Development.Shake.newCache' instead. newCacheIO :: (Eq k, Hashable k) => (k -> Action v) -> IO (k -> Action v) newCacheIO (act :: k -> Action v) = do var :: Var (Map.HashMap k (Fence (Either SomeException ([Depends],v)))) <- newVar Map.empty return $ \key -> join $ liftIO $ modifyVar var $ \mp -> case Map.lookup key mp of Just bar -> return $ (,) mp $ do res <- liftIO $ testFence bar (res,offset) <- case res of Just res -> return (res, 0) Nothing -> do Global{..} <- Action getRO offset <- liftIO offsetTime Action $ captureRAW $ \k -> waitFence bar $ \v -> addPoolResume globalPool $ do offset <- liftIO offset; k $ Right (v,offset) case res of Left err -> Action $ throwRAW err Right (deps,v) -> do Action $ modifyRW $ \s -> s{localDepends = deps ++ localDepends s, localDiscount = localDiscount s + offset} return v Nothing -> do bar <- newFence return $ (,) (Map.insert key bar mp) $ do Local{localDepends=pre} <- Action getRW res <- Action $ tryRAW $ fromAction $ act key case res of Left err -> do liftIO $ signalFence bar $ Left err Action $ throwRAW err Right v -> do 
Local{localDepends=post} <- Action getRW let deps = take (length post - length pre) post liftIO $ signalFence bar $ Right (deps, v) return v -- | Run an action without counting to the thread limit, typically used for actions that execute -- on remote machines using barely any local CPU resources. -- Unsafe as it allows the 'shakeThreads' limit to be exceeded. -- You cannot depend on a rule (e.g. 'need') while the extra thread is executing. -- If the rule blocks (e.g. calls 'withResource') then the extra thread may be used by some other action. -- Only really suitable for calling 'cmd' / 'command'. unsafeExtraThread :: Action a -> Action a unsafeExtraThread act = Action $ do Global{..} <- getRO stop <- liftIO $ increasePool globalPool res <- tryRAW $ fromAction $ blockApply "Within unsafeExtraThread" act liftIO stop captureRAW $ \continue -> (if isLeft res then addPoolException else addPoolResume) globalPool $ continue res -- | Execute a list of actions in parallel. In most cases 'need' will be more appropriate to benefit from parallelism. parallel :: [Action a] -> Action [a] -- Note: There is no parallel_ unlike sequence_ because there is no stack benefit to doing so parallel [] = return [] parallel [x] = fmap return x parallel acts = Action $ do global@Global{..} <- getRO local <- getRW -- number of items still to complete, or Nothing for has completed (by either failure or completion) todo :: Var (Maybe Int) <- liftIO $ newVar $ Just $ length acts -- a list of refs where the results go results :: [IORef (Maybe (Either SomeException (Local, a)))] <- liftIO $ replicateM (length acts) $ newIORef Nothing (locals, results) <- captureRAW $ \continue -> do let resume = do res <- liftIO $ sequence . 
catMaybes <$> mapM readIORef results
                    continue $ fmap unzip res
            liftIO $ forM_ (zip acts results) $ \(act, result) -> do
                let act2 = do
                        whenM (liftIO $ isNothing <$> readVar todo) $
                            fail "parallel, one has already failed"
                        res <- act
                        old <- Action getRW
                        return (old, res)
                addPoolResume globalPool $ runAction global (localClearMutable local) act2 $ \res -> do
                    writeIORef result $ Just res
                    modifyVar_ todo $ \v -> case v of
                        Nothing -> return Nothing
                        Just i | i == 1 || isLeft res -> do resume; return Nothing
                        Just i -> return $ Just $ i - 1
        modifyRW $ \root -> localMergeMutable root locals
        return results


-- | Run an action but do not depend on anything the action uses.
--   A more general version of 'orderOnly'.
orderOnlyAction :: Action a -> Action a
orderOnlyAction act = Action $ do
    Local{localDepends=pre} <- getRW
    res <- fromAction act
    modifyRW $ \s -> s{localDepends=pre}
    return res


-- | Batch different outputs into a single 'Action', typically useful when a command has a high
--   startup cost - e.g. @apt-get install foo bar baz@ is a lot cheaper than three separate
--   calls to @apt-get install@. As an example, if we have a standard build rule:
--
-- @
-- \"*.out\" 'Development.Shake.%>' \\out -> do
--     'Development.Shake.need' [out '-<.>' \"in\"]
--     'Development.Shake.cmd' "build-multiple" [out '-<.>' \"in\"]
-- @
--
--   Assuming that @build-multiple@ can compile multiple files in a single run,
--   and that the cost of doing so is a lot less than running each individually,
--   we can write:
--
-- @
-- 'batch' 3 (\"*.out\" 'Development.Shake.%>')
--     (\\out -> do 'Development.Shake.need' [out '-<.>' \"in\"]; return out)
--     (\\outs -> 'Development.Shake.cmd' "build-multiple" [out '-<.>' \"in\" | out \<- outs])
-- @
--
--   In contrast to the normal call, we have specified a maximum batch size of 3,
--   an action to run on each output individually (typically all the 'need' dependencies),
--   and an action that runs on multiple files at once. If we were to require lots of
--   @*.out@ files, they would typically be built in batches of 3.
--
--   If Shake ever has nothing else to do it will run batches before they are at the maximum,
--   so you may see much smaller batches, especially at high parallelism settings.
batch :: Int -> ((a -> Action ()) -> Rules ()) -> (a -> Action b) -> ([b] -> Action ()) -> Rules ()
batch mx pred one many
    | mx <= 0 = error $ "Can't call batch with a size <= 0, you used " ++ show mx
    | mx == 1 = pred $ \a -> do b <- one a; many [b]
    | otherwise = do
        todo :: IORef (Int, [(b, Either SomeException Local -> IO ())]) <- liftIO $ newIORef (0, [])
        pred $ \a -> Action $ do
            b <- fromAction $ one a
            -- optimisation would be to avoid taking the continuation if count >= mx
            -- but it only saves one pool requeue per mx, which is likely to be trivial
            -- and the code becomes a lot more special cases
            global@Global{..} <- getRO
            local <- getRW
            local2 <- captureRAW $ \k -> do
                count <- atomicModifyIORef todo $ \(count, bs) -> ((count+1, (b,k):bs), count+1)
                -- only trigger on the edge so we don't have lots of waiting pool entries
                (if count == mx then addPoolResume else if count == 1 then addPoolBatch else none) globalPool $
                    go global (localClearMutable local) todo
            modifyRW $ \root -> localMergeMutable root [local2]
    where
        none _ _ = return ()

        go global@Global{..} local todo = do
            (now, count) <- atomicModifyIORef todo $ \(count, bs) ->
                if count <= mx then
                    ((0, []), (bs, 0))
                else
                    let (xs,ys) = splitAt mx bs
                    in ((count - mx, ys), (xs, count - mx))
            (if count >= mx then addPoolResume else if count > 0 then addPoolBatch else none) globalPool $
                go global local todo
            unless (null now) $
                runAction global local (do many $ map fst now; Action getRW) $ \x ->
                    forM_ now $ \(_,k) ->
                        (if isLeft x then addPoolException else addPoolResume) globalPool $ k x


-- File: shake-0.16.4/src/Development/Shake/Internal/Core/Rules.hs

{-# LANGUAGE RecordWildCards,
ScopedTypeVariables #-} {-# LANGUAGE GeneralizedNewtypeDeriving, ConstraintKinds #-} {-# LANGUAGE ExistentialQuantification, RankNTypes #-} {-# LANGUAGE TypeFamilies #-} module Development.Shake.Internal.Core.Rules( Rules, runRules, RuleResult, addBuiltinRule, addBuiltinRuleEx, noLint, getShakeOptionsRules, userRuleMatch, getUserRules, addUserRule, alternatives, priority, action, withoutActions ) where import Control.Applicative import Data.Tuple.Extra import Control.Monad.Extra import Control.Monad.Fix import Control.Monad.IO.Class import Control.Monad.Trans.Class import Control.Monad.Trans.Reader import Control.Monad.Trans.Writer.Strict import Development.Shake.Classes import General.Binary import Data.Typeable.Extra import Data.Function import Data.List.Extra import qualified Data.HashMap.Strict as Map import Data.Maybe import System.IO.Extra import System.IO.Unsafe import Data.Semigroup (Semigroup (..)) import Data.Monoid hiding ((<>)) import qualified Data.ByteString.Lazy as LBS import qualified Data.Binary.Builder as Bin import Data.Binary.Put import Data.Binary.Get import General.ListBuilder import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Core.Monad import Development.Shake.Internal.Value import Development.Shake.Internal.Options import Development.Shake.Internal.Errors import Prelude --------------------------------------------------------------------- -- RULES -- | Get the 'UserRule' value at a given type. This 'UserRule' will capture -- all rules added, along with things such as 'priority' and 'alternatives'. getUserRules :: Typeable a => Action (UserRule a) getUserRules = f where f :: forall a . Typeable a => Action (UserRule a) f = do Global{..} <- Action getRO return $ case Map.lookup (typeRep (Proxy :: Proxy a)) globalUserRules of Nothing -> Unordered [] Just (UserRule_ r) -> fromJust $ cast r -- | Get the 'ShakeOptions' that were used. 
getShakeOptionsRules :: Rules ShakeOptions getShakeOptionsRules = Rules $ lift ask -- | Give a 'UserRule', and a function that tests a given rule, return the most important values -- that match. In most cases the caller will raise an error if the rule matching returns anything -- other than a singleton. userRuleMatch :: UserRule a -> (a -> Maybe b) -> [b] userRuleMatch u test = head $ (map snd $ reverse $ groupSort $ f Nothing $ fmap test u) ++ [[]] where f :: Maybe Double -> UserRule (Maybe a) -> [(Double,a)] f p (UserRule x) = maybe [] (\x -> [(fromMaybe 1 p,x)]) x f p (Unordered xs) = concatMap (f p) xs f p (Priority p2 x) = f (Just $ fromMaybe p2 p) x f p (Alternative x) = case f p x of [] -> [] -- a bit weird to use the max priority but the first value -- but that's what the current implementation does... xs -> [(maximum $ map fst xs, snd $ head xs)] -- | Define a set of rules. Rules can be created with calls to functions such as 'Development.Shake.%>' or 'action'. -- Rules are combined with either the 'Monoid' instance, or (more commonly) the 'Monad' instance and @do@ notation. -- To define your own custom types of rule, see "Development.Shake.Rule". newtype Rules a = Rules (WriterT SRules (ReaderT ShakeOptions IO) a) -- All IO must be associative/commutative (e.g. creating IORef/MVars) deriving (Functor, Applicative, Monad, MonadIO, MonadFix) newRules :: SRules -> Rules () newRules = Rules . 
tell modifyRules :: (SRules -> SRules) -> Rules () -> Rules () modifyRules f (Rules r) = Rules $ censor f r runRules :: ShakeOptions -> Rules () -> IO ([Action ()], Map.HashMap TypeRep BuiltinRule, Map.HashMap TypeRep UserRule_) runRules opts (Rules r) = do SRules{..} <- runReaderT (execWriterT r) opts return (runListBuilder actions, builtinRules, userRules) data SRules = SRules {actions :: !(ListBuilder (Action ())) ,builtinRules :: !(Map.HashMap TypeRep{-k-} BuiltinRule) ,userRules :: !(Map.HashMap TypeRep{-k-} UserRule_) } instance Semigroup SRules where (SRules x1 x2 x3) <> (SRules y1 y2 y3) = SRules (mappend x1 y1) (Map.unionWithKey f x2 y2) (Map.unionWith g x3 y3) where f k _ _ = unsafePerformIO $ errorRuleDefinedMultipleTimes k g (UserRule_ x) (UserRule_ y) = UserRule_ $ Unordered $ fromUnordered x ++ fromUnordered (fromJust $ cast y) fromUnordered (Unordered xs) = xs fromUnordered x = [x] instance Monoid SRules where mempty = SRules mempty Map.empty Map.empty mappend = (<>) instance Semigroup a => Semigroup (Rules a) where (<>) = liftA2 (<>) instance (Semigroup a, Monoid a) => Monoid (Rules a) where mempty = return mempty mappend = (<>) -- | Add a value of type 'UserRule'. addUserRule :: Typeable a => a -> Rules () addUserRule r = newRules mempty{userRules = Map.singleton (typeOf r) $ UserRule_ $ UserRule r} -- | A suitable 'BuiltinLint' that always succeeds. noLint :: BuiltinLint key value noLint _ _ = return Nothing type family RuleResult key -- = value -- | Add a builtin rule, comprising of a lint rule and an action. Each builtin rule must be identified by -- a unique key. addBuiltinRule :: (RuleResult key ~ value, ShakeValue key, ShakeValue value) => BuiltinLint key value -> BuiltinRun key value -> Rules () addBuiltinRule = addBuiltinRuleInternal $ BinaryOp (putEx . Bin.toLazyByteString . execPut . put) (runGet get . LBS.fromChunks . 
return) addBuiltinRuleEx :: (RuleResult key ~ value, ShakeValue key, BinaryEx key, Typeable value, NFData value, Show value) => BuiltinLint key value -> BuiltinRun key value -> Rules () addBuiltinRuleEx = addBuiltinRuleInternal $ BinaryOp putEx getEx -- | Unexpected version of 'addBuiltinRule', which also lets me set the 'BinaryOp'. addBuiltinRuleInternal :: (RuleResult key ~ value, ShakeValue key, Typeable value, NFData value, Show value) => BinaryOp key -> BuiltinLint key value -> BuiltinRun key value -> Rules () addBuiltinRuleInternal binary lint (run :: BuiltinRun key value) = do let k = Proxy :: Proxy key v = Proxy :: Proxy value let run_ k v b = fmap newValue <$> run (fromKey k) v b let lint_ k v = lint (fromKey k) (fromValue v) let binary_ = BinaryOp (putOp binary . fromKey) (newKey . getOp binary) newRules mempty{builtinRules = Map.singleton (typeRep k) $ BuiltinRule run_ lint_ (typeRep v) binary_} -- | Change the priority of a given set of rules, where higher priorities take precedence. -- All matching rules at a given priority must be disjoint, or an error is raised. -- All builtin Shake rules have priority between 0 and 1. -- Excessive use of 'priority' is discouraged. As an example: -- -- @ -- 'priority' 4 $ \"hello.*\" %> \\out -> 'writeFile'' out \"hello.*\" -- 'priority' 8 $ \"*.txt\" %> \\out -> 'writeFile'' out \"*.txt\" -- @ -- -- In this example @hello.txt@ will match the second rule, instead of raising an error about ambiguity. -- -- The 'priority' function obeys the invariants: -- -- @ -- 'priority' p1 ('priority' p2 r1) === 'priority' p1 r1 -- 'priority' p1 (r1 >> r2) === 'priority' p1 r1 >> 'priority' p1 r2 -- @ priority :: Double -> Rules () -> Rules () priority d = modifyRules $ \s -> s{userRules = Map.map f $ userRules s} where f (UserRule_ s) = UserRule_ $ Priority d s -- | Change the matching behaviour of rules so rules do not have to be disjoint, but are instead matched -- in order. 
Only recommended for small blocks containing a handful of rules. -- -- @ -- 'alternatives' $ do -- \"hello.*\" %> \\out -> 'writeFile'' out \"hello.*\" -- \"*.txt\" %> \\out -> 'writeFile'' out \"*.txt\" -- @ -- -- In this example @hello.txt@ will match the first rule, instead of raising an error about ambiguity. -- Inside 'alternatives' the 'priority' of each rule is not used to determine which rule matches, -- but the resulting match uses that priority compared to the rules outside the 'alternatives' block. alternatives :: Rules () -> Rules () alternatives = modifyRules $ \r -> r{userRules = Map.map f $ userRules r} where f (UserRule_ s) = UserRule_ $ Alternative s -- | Run an action, usually used for specifying top-level requirements. -- -- @ -- main = 'Development.Shake.shake' 'shakeOptions' $ do -- 'action' $ do -- b <- 'Development.Shake.doesFileExist' \"file.src\" -- when b $ 'Development.Shake.need' [\"file.out\"] -- @ -- -- This 'action' builds @file.out@, but only if @file.src@ exists. The 'action' -- will be run in every build execution (unless 'withoutActions' is used), so only cheap -- operations should be performed. All arguments to 'action' may be run in parallel, in any order. -- -- For the standard requirement of only 'Development.Shake.need'ing a fixed list of files in the 'action', -- see 'Development.Shake.want'. action :: Action a -> Rules () action a = newRules mempty{actions=newListBuilder $ void a} -- | Remove all actions specified in a set of rules, usually used for implementing -- command line specification of what to build. 
withoutActions :: Rules () -> Rules ()
withoutActions = modifyRules $ \x -> x{actions=mempty}


-- File: shake-0.16.4/src/Development/Shake/Internal/Core/Rendezvous.hs

{-# LANGUAGE ExistentialQuantification #-}

module Development.Shake.Internal.Core.Rendezvous(
    Waiting, newWaiting, afterWaiting,
    Answer(..), Compute(..),
    rendezvous
    ) where

import Control.Monad
import Data.IORef.Extra
import Data.Primitive.Array
import Development.Shake.Internal.Errors


-- | Given a sequence of 'Answer' values, the sequence stops
--   when there is a single 'Abort' or all values end up as 'Continue'.
data Answer a c
    = Abort a
    | Continue c

-- | A computation that either has a result available immediately,
--   or has a result that can be collected later.
data Compute a
    = Now a
    | Later (Waiting a)

partitionAnswer :: [Answer a c] -> ([a], [c])
partitionAnswer = foldr f ([],[])
    where f (Abort a)    ~(as,cs) = (a:as,cs)
          f (Continue c) ~(as,cs) = (as,c:cs)

partitionCompute :: [Compute a] -> ([a], [Waiting a])
partitionCompute = foldr f ([],[])
    where f (Now x)   ~(xs,ws) = (x:xs,ws)
          f (Later w) ~(xs,ws) = (xs,w:ws)

-- | A type representing someone waiting for a result.
data Waiting a = forall b . Waiting (b -> a) (IORef (b -> IO ()))
    -- Contains a functor value to apply, along with somewhere to register callbacks

instance Functor Waiting where
    fmap f (Waiting op ref) = Waiting (f . op) ref

instance Show (Waiting a) where
    show _ = "Waiting"

newWaiting :: IO (Waiting a, a -> IO ())
newWaiting = do
    ref <- newIORef $ \_ -> return ()
    let run x = ($ x) =<< readIORef ref
    return (Waiting id ref, run)

afterWaiting :: Waiting a -> (a -> IO ()) -> IO ()
afterWaiting (Waiting op ref) act =
    modifyIORef' ref (\a s -> a s >> act (op s))

rendezvous :: [Compute (Answer a c)] -> IO (Compute (Either a [c]))
rendezvous xs = do
    let (now, later) = partitionCompute xs
    let (abort, continue) = partitionAnswer now
    if not $ null abort then
        return $ Now $ Left $ head abort
    else if null later then
        return $ Now $ Right continue
    else do
        (waiting, run) <- newWaiting
        let n = length xs
        result <- newArray n $ errorInternal "rendezvous"
        todo <- newIORef $ length later
        forM_ (zip [0..] xs) $ \(i,x) -> case x of
            Now (Continue c) -> writeArray result i c
            Later w -> afterWaiting w $ \v -> do
                t <- readIORef todo
                case v of
                    _ | t == 0 -> return () -- must have already aborted
                    Abort a -> do
                        writeIORef todo 0
                        run $ Left a
                    Continue c -> do
                        writeArray result i c
                        writeIORef' todo $ t-1
                        when (t == 1) $ do
                            rs <- unsafeFreezeArray result
                            run $ Right $ map (indexArray rs) [0..n-1]
        return $ Later waiting


-- File: shake-0.16.4/src/Development/Shake/Internal/Core/Pool.hs

-- | Thread pool implementation. The four names correspond to the following
--   priority levels (highest to lowest):
--
-- * 'addPoolException' - things that probably result in a build error,
--   so kick them off quickly.
--
-- * 'addPoolResume' - things that started, blocked, and may have open
--   resources in their closure.
--
-- * 'addPoolStart' - rules that haven't yet started.
--
-- * 'addPoolBatch' - rules that might batch if other rules start first.
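-- The priority ordering described above can be illustrated with a tiny
-- standalone model. 'MiniQueue', 'dequeueMini' and the field names are
-- hypothetical, not part of this module; the real queue uses 'Bag's and
-- lenses, but the dequeue policy is the same: scan the levels from highest
-- to lowest and take from the first non-empty one.

```haskell
import Data.Maybe (listToMaybe)

-- one list per priority level, highest first
data MiniQueue a = MiniQueue {qException, qResume, qStart, qBatch :: [a]}

-- take from the first non-empty level
dequeueMini :: MiniQueue a -> Maybe a
dequeueMini q = listToMaybe $ concat [qException q, qResume q, qStart q, qBatch q]

main :: IO ()
main = print $ dequeueMini (MiniQueue [] ["resume"] ["start"] [])
    -- prints Just "resume": the resume level outranks the start level
```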
module Development.Shake.Internal.Core.Pool( Pool, runPool, addPoolException, addPoolResume, addPoolStart, addPoolBatch, increasePool ) where import Control.Concurrent.Extra import System.Time.Extra import Control.Exception import Control.Monad.Extra import General.Timing import General.Extra import qualified General.Bag as Bag import qualified Data.HashSet as Set --------------------------------------------------------------------- -- UNFAIR/RANDOM QUEUE data Queue a = Queue {queueException :: Bag.Bag a ,queueResume :: Bag.Bag a ,queueStart :: Bag.Bag a ,queueBatch :: Bag.Bag a } lensException = (queueException, \x v -> x{queueException=v}) lensResume = (queueResume, \x v -> x{queueResume=v}) lensStart = (queueStart, \x v -> x{queueStart=v}) lensBatch = (queueBatch, \x v -> x{queueBatch=v}) lenses = [lensException, lensResume, lensStart, lensBatch] newQueue :: Bool -> Queue a newQueue deterministic = Queue b b b b where b = if deterministic then Bag.emptyPure else Bag.emptyRandom dequeue :: Queue a -> Bag.Randomly (Maybe (a, Queue a)) dequeue q = firstJustM f lenses where f (sel, upd) | Just x <- Bag.remove $ sel q = do (x,b) <- x; return $ Just (x, upd q b) f _ = return Nothing --------------------------------------------------------------------- -- THREAD POOL {- Must keep a list of active threads, so can raise exceptions in a timely manner If any worker throws an exception, must signal to all the other workers -} data Pool = Pool !(Var (Maybe S)) -- Current state, 'Nothing' to say we are aborting !(Barrier (Either SomeException S)) -- Barrier to signal that we are finished data S = S {threads :: !(Set.HashSet ThreadId) -- IMPORTANT: Must be strict or we leak thread stacks ,threadsLimit :: {-# UNPACK #-} !Int -- user supplied thread limit, Set.size threads <= threadsLimit ,threadsMax :: {-# UNPACK #-} !Int -- high water mark of Set.size threads (accounting only) ,threadsSum :: {-# UNPACK #-} !Int -- number of threads we have been through (accounting only) ,todo 
:: !(Queue (IO ())) -- operations waiting a thread
    }

emptyS :: Int -> Bool -> S
emptyS n deterministic = S Set.empty n 0 0 $ newQueue deterministic

worker :: Pool -> IO ()
worker pool@(Pool var done) = do
    let onVar act = modifyVar var $ maybe (return (Nothing, return ())) act
    join $ onVar $ \s -> do
        res <- dequeue $ todo s
        case res of
            Nothing -> return (Just s, return ())
            Just (now, todo2) -> return (Just s{todo = todo2}, now >> worker pool)

-- | Given a pool, and a function that breaks the S invariants, restore them.
--   They are only allowed to touch threadsLimit or todo.
step :: Pool -> (S -> Bag.Randomly S) -> IO ()
step pool@(Pool var done) op = do
    let onVar act = modifyVar_ var $ maybe (return Nothing) act
    onVar $ \s -> do
        s <- op s
        res <- dequeue $ todo s
        case res of
            Just (now, todo2) | Set.size (threads s) < threadsLimit s -> do
                -- spawn a new worker
                t <- forkFinallyUnmasked (now >> worker pool) $ \res -> case res of
                    Left e -> onVar $ \s -> do
                        t <- myThreadId
                        mapM_ killThread $ Set.toList $ Set.delete t $ threads s
                        signalBarrier done $ Left e
                        return Nothing
                    Right _ -> do
                        t <- myThreadId
                        step pool $ \s -> return s{threads = Set.delete t $ threads s}
                let threads2 = Set.insert t $ threads s
                return $ Just s{todo = todo2, threads = threads2
                               ,threadsSum = threadsSum s + 1, threadsMax = threadsMax s `max` Set.size threads2}
            Nothing | Set.null $ threads s -> do
                signalBarrier done $ Right s
                return Nothing
            _ -> return $ Just s

addPool (sel, upd) pool act = step pool $ \s ->
    return s{todo = upd (todo s) $ Bag.insert (void act) $ sel $ todo s}

-- | Add a new task to the pool. See the top of the module for the relative ordering
--   and semantics.
addPoolException, addPoolResume, addPoolStart, addPoolBatch :: Pool -> IO a -> IO ()
addPoolException = addPool lensException
addPoolResume = addPool lensResume
addPoolStart = addPool lensStart
addPoolBatch = addPool lensBatch

-- | Temporarily increase the pool by 1 thread. Call the cleanup action to restore the value.
--   After calling cleanup you should requeue onto a new thread.
increasePool :: Pool -> IO (IO ())
increasePool pool = do
    step pool $ \s -> return s{threadsLimit = threadsLimit s + 1}
    return $ step pool $ \s -> return s{threadsLimit = threadsLimit s - 1}

-- | Run all the tasks in the pool on the given number of workers.
--   If any thread throws an exception, the exception will be reraised.
--   When it completes all threads have either finished, or have had 'killThread'
--   called on them (but may not have actually died yet).
runPool :: Bool -> Int -> (Pool -> IO ()) -> IO () -- run all tasks in the pool
runPool deterministic n act = do
    s <- newVar $ Just $ emptyS n deterministic
    done <- newBarrier

    let cleanup = modifyVar_ s $ \s -> do
            -- if someone kills our thread, make sure we kill our child threads
            case s of
                Just s -> mapM_ killThread $ Set.toList $ threads s
                Nothing -> return ()
            return Nothing

    let ghc10793 = do
            -- if this thread dies because it is blocked on an MVar there's a chance we have
            -- a better error in the done barrier, and GHC raised the exception wrongly, see:
            -- https://ghc.haskell.org/trac/ghc/ticket/10793
            sleep 1 -- give it a little bit of time for the finally to run
                    -- no big deal, since the blocked indefinitely takes a while to fire anyway
            res <- waitBarrierMaybe done
            case res of
                Just (Left e) -> throwIO e
                _ -> throwIO BlockedIndefinitelyOnMVar
    handle (\BlockedIndefinitelyOnMVar -> ghc10793) $ flip onException cleanup $ do
        let pool = Pool s done
        addPoolStart pool $ act pool
        res <- waitBarrier done
        case res of
            Left e -> throwIO e
            Right s -> addTiming $ "Pool finished (" ++ show (threadsSum s) ++ " threads, " ++ show (threadsMax s) ++ " max)"


-- File: shake-0.16.4/src/Development/Shake/Internal/Core/Monad.hs

{-# LANGUAGE CPP #-}
{-# LANGUAGE GADTs, ScopedTypeVariables #-}

module Development.Shake.Internal.Core.Monad(
    RAW, Capture, runRAW,
    getRO, getRW, putRW, modifyRW,
    catchRAW, tryRAW,
throwRAW, captureRAW, ) where import Control.Exception.Extra import Control.Monad.IO.Class import Data.IORef.Extra import Control.Applicative import Control.Monad import Prelude #if __GLASGOW_HASKELL__ >= 800 import Control.Monad.Fail #endif data RAW ro rw a where Fmap :: (a -> b) -> RAW ro rw a -> RAW ro rw b Pure :: a -> RAW ro rw a Ap :: RAW ro rw (a -> b) -> RAW ro rw a -> RAW ro rw b Next :: RAW ro rw a -> RAW ro rw b -> RAW ro rw b Bind :: RAW ro rw a -> (a -> RAW ro rw b) -> RAW ro rw b LiftIO :: IO a -> RAW ro rw a GetRO :: RAW ro rw ro GetRW :: RAW ro rw rw PutRW :: !rw -> RAW ro rw () ModifyRW :: (rw -> rw) -> RAW ro rw () CaptureRAW :: Capture (Either SomeException a) -> RAW ro rw a CatchRAW :: RAW ro rw a -> (SomeException -> RAW ro rw a) -> RAW ro rw a instance Functor (RAW ro rw) where fmap = Fmap instance Applicative (RAW ro rw) where pure = Pure (*>) = Next (<*>) = Ap instance Monad (RAW ro rw) where return = pure (>>) = (*>) (>>=) = Bind instance MonadIO (RAW ro rw) where liftIO = LiftIO #if __GLASGOW_HASKELL__ >= 800 instance MonadFail (RAW ro rw) where fail = liftIO . Control.Monad.Fail.fail #endif type Capture a = (a -> IO ()) -> IO () -- | Run and then call a continuation. runRAW :: ro -> rw -> RAW ro rw a -> Capture (Either SomeException a) runRAW ro rw m k = do rw <- newIORef rw handler <- newIORef $ k . Left goRAW handler ro rw m (k . Right) `catch_` \e -> ($ e) =<< readIORef handler goRAW :: forall ro rw a . 
IORef (SomeException -> IO ()) -> ro -> IORef rw -> RAW ro rw a -> Capture a goRAW handler ro rw = go where go :: RAW ro rw b -> Capture b go x k = case x of Fmap f a -> go a $ \v -> k $ f v Pure a -> k a Ap f x -> go f $ \f -> go x $ \v -> k $ f v Bind a b -> go a $ \a -> go (b a) k Next a b -> go a $ \_ -> go b k LiftIO x -> k =<< x GetRO -> k ro GetRW -> k =<< readIORef rw PutRW x -> writeIORef rw x >> k () ModifyRW f -> modifyIORef' rw f >> k () CatchRAW m hdl -> do old <- readIORef handler writeIORef handler $ \e -> do writeIORef handler old go (hdl e) k `catch_` \e -> ($ e) =<< readIORef handler go m $ \x -> writeIORef handler old >> k x CaptureRAW f -> do old <- readIORef handler writeIORef handler throwIO f $ \x -> case x of Left e -> old e Right v -> do writeIORef handler old k v `catch_` \e -> ($ e) =<< readIORef handler writeIORef handler throwIO --------------------------------------------------------------------- -- STANDARD getRO :: RAW ro rw ro getRO = GetRO getRW :: RAW ro rw rw getRW = GetRW -- | Strict version putRW :: rw -> RAW ro rw () putRW = PutRW modifyRW :: (rw -> rw) -> RAW ro rw () modifyRW = ModifyRW --------------------------------------------------------------------- -- EXCEPTIONS catchRAW :: RAW ro rw a -> (SomeException -> RAW ro rw a) -> RAW ro rw a catchRAW = CatchRAW tryRAW :: RAW ro rw a -> RAW ro rw (Either SomeException a) tryRAW m = catchRAW (fmap Right m) (return . Left) throwRAW :: Exception e => e -> RAW ro rw a throwRAW = liftIO . throwIO --------------------------------------------------------------------- -- CONTINUATIONS -- | Capture a continuation. The continuation should be called at most once. -- Calling the same continuation, multiple times, in parallel, results in incorrect behaviour. 
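-- A minimal standalone model of the capture pattern documented above, kept
-- independent of the RAW type. 'suspendWith' is hypothetical; it only shows
-- the shape of the contract: the computation hands its continuation to a
-- registration function, which must invoke it at most once.

```haskell
-- same shape as the Capture alias defined in this module
type Capture a = (a -> IO ()) -> IO ()

-- hand the continuation 'k' to a scheduler; here it resumes immediately,
-- but it could equally stash 'k' and call it from another thread later
suspendWith :: Capture a -> (a -> IO ()) -> IO ()
suspendWith register k = register k

main :: IO ()
main = suspendWith
    (\k -> putStrLn "suspended" >> k (42 :: Int))
    (\x -> putStrLn $ "resumed with " ++ show x)
    -- prints "suspended" then "resumed with 42"
```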
captureRAW :: Capture (Either SomeException a) -> RAW ro rw a captureRAW = CaptureRAW shake-0.16.4/src/Development/Shake/Internal/Core/Database.hs0000644000000000000000000005341513261223301021674 0ustar0000000000000000{-# LANGUAGE RecordWildCards, PatternGuards, DeriveFunctor #-} {-# LANGUAGE Rank2Types, FlexibleInstances #-} {-# LANGUAGE DeriveDataTypeable, GeneralizedNewtypeDeriving #-} module Development.Shake.Internal.Core.Database( Trace(..), newTrace, Database, withDatabase, assertFinishedDatabase, listDepends, lookupDependencies, lookupStatus, BuildKey(..), build, Depends, nubDepends, Step, Result(..), progress, Stack, emptyStack, topStack, showStack, showTopStack, toReport, checkValid, listLive ) where import Development.Shake.Classes import General.Binary import Development.Shake.Internal.Core.Pool import Development.Shake.Internal.Value import Development.Shake.Internal.Errors import Development.Shake.Internal.Core.Storage import Development.Shake.Internal.Options import Development.Shake.Internal.Profile import Development.Shake.Internal.Core.Monad import Development.Shake.Internal.Core.Rendezvous import qualified Data.ByteString.Char8 as BS import Data.Word import General.Extra import qualified General.Intern as Intern import General.Intern(Id, Intern) import Numeric.Extra import Control.Applicative import Control.Exception import Control.Monad.Extra import Control.Concurrent.Extra import qualified Data.HashSet as Set import qualified Data.HashMap.Strict as Map import qualified General.Ids as Ids import Foreign.Storable import Data.Typeable.Extra import Data.IORef.Extra import Data.Maybe import Data.List import Data.Tuple.Extra import Data.Either.Extra import System.Time.Extra import Data.Monoid import Prelude type Map = Map.HashMap --------------------------------------------------------------------- -- UTILITY TYPES newtype Step = Step Word32 deriving (Eq,Ord,Show,Storable,BinaryEx,NFData,Hashable,Typeable) incStep (Step i) = Step $ i + 1 
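-- The call-stack machinery in the next section pairs a list (for ordering)
-- with a set (for fast membership), so a rule depending on itself is caught
-- as soon as the key is pushed. A standalone sketch of that idea follows;
-- 'MiniStack' and 'pushMini' are illustrative, and Data.Set stands in for
-- the HashSet used by the real code.

```haskell
import qualified Data.Set as Set

-- invariant: the set contains exactly the elements of the list
data MiniStack k = MiniStack [k] (Set.Set k)

-- pushing a key already on the stack reports a dependency cycle
pushMini :: Ord k => k -> MiniStack k -> Either k (MiniStack k)
pushMini k (MiniStack ks set)
    | k `Set.member` set = Left k
    | otherwise = Right $ MiniStack (k:ks) (Set.insert k set)

main :: IO ()
main = do
    let Right s1 = pushMini "a.o" (MiniStack [] Set.empty)
        Right s2 = pushMini "a.c" s1
    putStrLn $ either ("cycle at " ++) (const "ok") (pushMini "a.o" s2)
    -- prints "cycle at a.o"
```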
--------------------------------------------------------------------- -- CALL STACK -- Invariant: Stack xs set . HashSet.fromList (map fst xs) == set data Stack = Stack [(Id,Key)] !(Set.HashSet Id) showStack :: Stack -> [String] showStack (Stack xs _) = reverse $ map (show . snd) xs showTopStack :: Stack -> String showTopStack = maybe "" show . topStack addStack :: Id -> Key -> Stack -> Stack addStack x key (Stack xs set) = Stack ((x,key):xs) (Set.insert x set) topStack :: Stack -> Maybe Key topStack (Stack xs _) = snd <$> listToMaybe xs checkStack :: [Id] -> Stack -> Maybe (Id,Key) checkStack new (Stack xs set) | bad:_ <- filter (`Set.member` set) new = Just (bad, fromJust $ lookup bad xs) | otherwise = Nothing emptyStack :: Stack emptyStack = Stack [] Set.empty --------------------------------------------------------------------- -- TRACE data Trace = Trace {-# UNPACK #-} !BS.ByteString {-# UNPACK #-} !Float {-# UNPACK #-} !Float -- ^ (message, start, end) deriving Show instance NFData Trace where rnf x = x `seq` () -- all strict atomic fields newTrace :: String -> Double -> Double -> Trace newTrace msg start stop = Trace (BS.pack msg) (doubleToFloat start) (doubleToFloat stop) --------------------------------------------------------------------- -- CENTRAL TYPES type StatusDB = Ids.Ids (Key, Status) type InternDB = IORef (Intern Key) -- | Invariant: The database does not have any cycles where a Key depends on itself data Database = Database {lock :: Lock ,intern :: InternDB ,status :: StatusDB ,step :: {-# UNPACK #-} !Step ,journal :: Id -> Key -> Result BS.ByteString -> IO () ,diagnostic :: IO String -> IO () -- ^ logging function } data Status = Ready (Result Value) -- ^ I have a value | Error SomeException -- ^ I have been run and raised an error | Loaded (Result BS.ByteString) -- ^ Loaded from the database | Waiting (Waiting Status) (Maybe (Result BS.ByteString)) -- ^ Currently checking if I am valid or building | Missing -- ^ I am only here because I got 
into the Intern table deriving Show instance NFData Status where rnf x = case x of Ready x -> rnfResult rnf x Error x -> rnf $ show x -- Best I can do for arbitrary exceptions Loaded x -> rnfResult id x Waiting _ x -> maybe () (rnfResult id) x -- Can't RNF a waiting, but also unnecessary Missing -> () where -- ignore the unpacked fields -- complex because ByteString lacks NFData in GHC 7.4 and below rnfResult by (Result a _ _ b _ c) = by a `seq` rnf b `seq` rnf c `seq` () {-# INLINE rnfResult #-} data Result a = Result {result :: a -- ^ the result associated with the Key ,built :: {-# UNPACK #-} !Step -- ^ when it was actually run ,changed :: {-# UNPACK #-} !Step -- ^ the step for deciding if it's valid ,depends :: [Depends] -- ^ dependencies (don't run them early) ,execution :: {-# UNPACK #-} !Float -- ^ how long it took when it was last run (seconds) ,traces :: [Trace] -- ^ a trace of the expensive operations (start/end in seconds since beginning of run) } deriving (Show,Functor) statusType Ready{} = "Ready" statusType Error{} = "Error" statusType Loaded{} = "Loaded" statusType Waiting{} = "Waiting" statusType Missing{} = "Missing" getResult :: Status -> Maybe (Result (Either BS.ByteString Value)) getResult (Ready r) = Just $ Right <$> r getResult (Loaded r) = Just $ Left <$> r getResult (Waiting _ r) = fmap Left <$> r getResult _ = Nothing --------------------------------------------------------------------- -- OPERATIONS newtype Depends = Depends {fromDepends :: [Id]} deriving NFData instance Show Depends where -- Appears in diagnostic output and the Depends ctor is just verbose show = show . 
fromDepends -- | Afterwards each Id must occur at most once and there are no empty Depends nubDepends :: [Depends] -> [Depends] nubDepends = fMany Set.empty where fMany seen [] = [] fMany seen (Depends d:ds) = [Depends d2 | d2 /= []] ++ fMany seen2 ds where (d2,seen2) = fOne seen d fOne seen [] = ([], seen) fOne seen (x:xs) | x `Set.member` seen = fOne seen xs fOne seen (x:xs) = first (x:) $ fOne (Set.insert x seen) xs newtype BuildKey = BuildKey {buildKey :: Stack -- Given the current stack with the key added on -> Step -- And the current step -> Key -- The key to build -> Maybe (Result BS.ByteString) -- A previous result, or Nothing if never been built before -> Bool -- True if any of the children were dirty -> Capture (Either SomeException (Bool, BS.ByteString, Result Value)) -- Either an error, or a result. -- If the Bool is True you should rewrite the database entry. } type Returns a = forall b . (a -> IO b) -> (Capture a -> IO b) -> IO b internKey :: InternDB -> StatusDB -> Key -> IO Id internKey intern status k = do is <- readIORef intern case Intern.lookup k is of Just i -> return i Nothing -> do (is, i) <- return $ Intern.add k is writeIORef' intern is Ids.insert status i (k,Missing) return i lookupStatus :: Database -> Key -> IO (Maybe (Either BS.ByteString Value)) lookupStatus Database{..} k = withLock lock $ do i <- internKey intern status k maybe Nothing (fmap result . getResult . 
snd) <$> Ids.lookup status i -- | Return either an exception (crash), or (how much time you spent waiting, the value) build :: Pool -> Database -> BuildKey -> Stack -> [Key] -> Capture (Either SomeException (Seconds,Depends,[Value])) build pool Database{..} BuildKey{..} stack ks continue = join $ withLock lock $ do is <- forM ks $ internKey intern status buildMany stack is (\v -> case v of Error e -> Just e; _ -> Nothing) (\v -> return $ continue $ case v of Left e -> Left e Right rs -> Right (0, Depends is, map result rs)) $ \go -> do -- only bother doing the stack check if we're actually going to build stuff whenJust (checkStack is stack) $ \(badId, badKey) -> -- everything else gets thrown via Left and can be Staunch'd -- recursion in the rules is considered a worse error, so fails immediately errorRuleRecursion (showStack stack ++ [show badKey]) (typeKey badKey) (show badKey) time <- offsetTime go $ \x -> case x of Left e -> addPoolException pool $ continue $ Left e Right rs -> addPoolResume pool $ do dur <- time; continue $ Right (dur, Depends is, map result rs) return $ return () where (#=) :: Id -> (Key, Status) -> IO Status i #= (k,v) = do diagnostic $ do old <- Ids.lookup status i return $ maybe "Missing" (statusType . snd) old ++ " -> " ++ statusType v ++ ", " ++ maybe "" (show . fst) old Ids.insert status i (k,v) return v buildMany :: Stack -> [Id] -> (Status -> Maybe a) -> Returns (Either a [Result Value]) buildMany stack is test fast slow = do let toAnswer v | Just v <- test v = Abort v toAnswer (Ready v) = Continue v let toCompute (Waiting w _) = Later $ toAnswer <$> w toCompute x = Now $ toAnswer x res <- rendezvous =<< mapM (fmap toCompute . 
reduce stack) is case res of Now v -> fast v Later w -> slow $ \slow -> afterWaiting w slow -- Rules for each of the following functions -- * Must NOT lock -- * Must have an equal return to what is stored in the db at that point -- * Must return one of the designated subset of values reduce :: Stack -> Id -> IO Status {- Ready | Error | Waiting -} reduce stack i = do s <- Ids.lookup status i case s of Nothing -> errorInternal $ "interned value missing from database, " ++ show i Just (k, Missing) -> spawn True stack i k Nothing Just (k, Loaded r) -> check stack i k r (depends r) Just (k, res) -> return res -- | Given a Key and the list of dependencies yet to be checked, check them check :: Stack -> Id -> Key -> Result BS.ByteString -> [Depends] -> IO Status {- Ready | Waiting -} check stack i k r [] = spawn False stack i k $ Just r check stack i k r (Depends ds:rest) = do let cont v = if isLeft v then spawn True stack i k $ Just r else check stack i k r rest buildMany (addStack i k stack) ds (\v -> case v of Error _ -> Just () Ready dep | changed dep > built r -> Just () _ -> Nothing) cont $ \go -> do (self, done) <- newWaiting go $ \v -> do res <- cont v case res of Waiting w _ -> afterWaiting w done _ -> done res i #= (k, Waiting self $ Just r) -- | Given a Key, queue up execution and return waiting spawn :: Bool -> Stack -> Id -> Key -> Maybe (Result BS.ByteString) -> IO Status {- Waiting -} spawn dirtyChildren stack i k r = do (w, done) <- newWaiting addPoolStart pool $ buildKey (addStack i k stack) step k r dirtyChildren $ \res -> do let status = either Error (Ready . 
thd3) res withLock lock $ do i #= (k, status) done status case res of Right (write, bs, r) -> do diagnostic $ return $ "result " ++ showBracket k ++ " = "++ showBracket (result r) ++ " " ++ (if built r == changed r then "(changed)" else "(unchanged)") when write $ journal i k r{result=bs} Left _ -> diagnostic $ return $ "result " ++ showBracket k ++ " = error" i #= (k, Waiting w r) --------------------------------------------------------------------- -- PROGRESS progress :: Database -> IO Progress progress Database{..} = do xs <- Ids.toList status return $! foldl' f mempty $ map (snd . snd) xs where g = floatToDouble f s (Ready Result{..}) = if step == built then s{countBuilt = countBuilt s + 1, timeBuilt = timeBuilt s + g execution} else s{countSkipped = countSkipped s + 1, timeSkipped = timeSkipped s + g execution} f s (Loaded Result{..}) = s{countUnknown = countUnknown s + 1, timeUnknown = timeUnknown s + g execution} f s (Waiting _ r) = let (d,c) = timeTodo s t | Just Result{..} <- r = let d2 = d + g execution in d2 `seq` (d2,c) | otherwise = let c2 = c + 1 in c2 `seq` (d,c2) in s{countTodo = countTodo s + 1, timeTodo = t} f s _ = s --------------------------------------------------------------------- -- QUERY DATABASE assertFinishedDatabase :: Database -> IO () assertFinishedDatabase Database{..} = do -- if you have anyone Waiting, and are not exiting with an error, then you must have a complex recursion (see #400) status <- Ids.toList status let bad = [key | (_, (key, Waiting{})) <- status] when (bad /= []) $ errorComplexRecursion (map show bad) -- | Given a map representing a dependency order (with a show for error messages), find an ordering for the items such -- that no item points to an item before itself. -- Raise an error if you end up with a cycle.
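-- A worked sketch of the algorithm above (the input map is an assumption,
-- not taken from this module): given
--   dependencyOrder show (Map.fromList [(1,[2,3]),(2,[3]),(3,[])])
-- key 3 is the only one with no dependencies, so it is emitted first; freeing
-- 3 promotes 2, and freeing 2 promotes 1, giving [3,2,1] - each key appears
-- only after all of its dependencies have been emitted.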
dependencyOrder :: (Eq a, Hashable a) => (a -> String) -> Map a [a] -> [a] -- Algorithm: -- Divide everyone up into those who have no dependencies [Id] -- And those who depend on a particular Id, Dep :-> Maybe [(Key,[Dep])] -- Where d :-> Just (k, ds), k depends on firstly d, then remaining on ds -- For each with no dependencies, add to list, then take its dep hole and -- promote them either to Nothing (if ds == []) or into a new slot. -- k :-> Nothing means the key has already been freed dependencyOrder shw status = f (map fst noDeps) $ Map.map Just $ Map.fromListWith (++) [(d, [(k,ds)]) | (k,d:ds) <- hasDeps] where (noDeps, hasDeps) = partition (null . snd) $ Map.toList status f [] mp | null bad = [] | otherwise = error $ unlines $ "Internal invariant broken, database seems to be cyclic" : map (" " ++) bad ++ ["... plus " ++ show (length badOverflow) ++ " more ..." | not $ null badOverflow] where (bad,badOverflow) = splitAt 10 [shw i | (i, Just _) <- Map.toList mp] f (x:xs) mp = x : f (now++xs) later where Just free = Map.lookupDefault (Just []) x mp (now,later) = foldl' g ([], Map.insert x Nothing mp) free g (free, mp) (k, []) = (k:free, mp) g (free, mp) (k, d:ds) = case Map.lookupDefault (Just []) d mp of Nothing -> g (free, mp) (k, ds) Just todo -> (free, Map.insert d (Just $ (k,ds) : todo) mp) -- | Eliminate all errors from the database, pretending they don't exist resultsOnly :: Map Id (Key, Status) -> Map Id (Key, Result (Either BS.ByteString Value)) resultsOnly mp = Map.map (\(k, v) -> (k, let Just r = getResult v in r{depends = map (Depends . filter (isJust . flip Map.lookup keep) . fromDepends) $ depends r})) keep where keep = Map.filter (isJust . getResult . snd) mp removeStep :: Map Id (Key, Result a) -> Map Id (Key, Result a) removeStep = Map.filter (\(k,_) -> k /= stepKey) toReport :: Database -> IO [ProfileEntry] toReport Database{..} = do status <- removeStep . resultsOnly <$> Ids.toMap status let order = let shw i = maybe "" (show . 
fst) $ Map.lookup i status in dependencyOrder shw $ Map.map (concatMap fromDepends . depends . snd) status ids = Map.fromList $ zip order [0..] steps = let xs = Set.toList $ Set.fromList $ concat [[changed, built] | (_,Result{..}) <- Map.elems status] in Map.fromList $ zip (sortBy (flip compare) xs) [0..] f (k, Result{..}) = ProfileEntry {prfName = show k ,prfBuilt = fromStep built ,prfChanged = fromStep changed ,prfDepends = mapMaybe (`Map.lookup` ids) (concatMap fromDepends depends) ,prfExecution = floatToDouble execution ,prfTraces = map fromTrace traces } where fromStep i = fromJust $ Map.lookup i steps fromTrace (Trace a b c) = ProfileTrace (BS.unpack a) (floatToDouble b) (floatToDouble c) return [maybe (errorInternal "toReport") f $ Map.lookup i status | i <- order] checkValid :: Database -> (Key -> Value -> IO (Maybe String)) -> [(Key, Key)] -> IO () checkValid Database{..} check missing = do status <- Ids.toList status intern <- readIORef intern diagnostic $ return "Starting validity/lint checking" -- Do not use a forM here as you use too much stack space bad <- (\f -> foldM f [] status) $ \seen (i,v) -> case v of (key, Ready Result{..}) -> do good <- check key result diagnostic $ return $ "Checking if " ++ show key ++ " is " ++ show result ++ ", " ++ if isNothing good then "passed" else "FAILED" return $ [(key, result, now) | Just now <- [good]] ++ seen _ -> return seen unless (null bad) $ do let n = length bad errorStructured ("Lint checking error - " ++ (if n == 1 then "value has" else show n ++ " values have") ++ " changed since being depended upon") (intercalate [("",Just "")] [ [("Key", Just $ show key),("Old", Just $ show result),("New", Just now)] | (key, result, now) <- bad]) "" bad <- return [(parent,key) | (parent, key) <- missing, isJust $ Intern.lookup key intern] unless (null bad) $ do let n = length bad errorStructured ("Lint checking error - " ++ (if n == 1 then "value" else show n ++ " values") ++ " did not have " ++ (if n == 1 then "its" 
else "their") ++ " creation tracked") (intercalate [("",Just "")] [ [("Rule", Just $ show parent), ("Created", Just $ show key)] | (parent,key) <- bad]) "" diagnostic $ return "Validity/lint check passed" listLive :: Database -> IO [Key] listLive Database{..} = do diagnostic $ return "Listing live keys" status <- Ids.toList status return [k | (_, (k, Ready{})) <- status] listDepends :: Database -> Depends -> IO [Key] listDepends Database{..} (Depends xs) = withLock lock $ forM xs $ \x -> fst . fromJust <$> Ids.lookup status x lookupDependencies :: Database -> Key -> IO [Key] lookupDependencies Database{..} k = withLock lock $ do intern <- readIORef intern let Just i = Intern.lookup k intern Just (_, Ready r) <- Ids.lookup status i forM (concatMap fromDepends $ depends r) $ \x -> fst . fromJust <$> Ids.lookup status x --------------------------------------------------------------------- -- STORAGE -- To simplify journaling etc we smuggle the Step in the database, with a special StepKey newtype StepKey = StepKey () deriving (Show,Eq,Typeable,Hashable,Binary,BinaryEx,NFData) stepKey :: Key stepKey = newKey $ StepKey () toStepResult :: Step -> Result BS.ByteString toStepResult i = Result (runBuilder $ putEx i) i i [] 0 [] fromStepResult :: Result BS.ByteString -> Step fromStepResult = getEx . 
result withDatabase :: ShakeOptions -> (IO String -> IO ()) -> Map TypeRep (BinaryOp Key) -> (Database -> IO a) -> IO a withDatabase opts diagnostic witness act = do let step = (typeRep (Proxy :: Proxy StepKey), BinaryOp (const mempty) (const stepKey)) witness <- return $ Map.fromList [ (QTypeRep t, BinaryOp (putDatabase putOp) (getDatabase getOp)) | (t,BinaryOp{..}) <- step : Map.toList witness] withStorage opts diagnostic witness $ \status journal -> do journal <- return $ \i k v -> journal (QTypeRep $ typeKey k) i (k, Loaded v) xs <- Ids.toList status let mp1 = Intern.fromList [(k, i) | (i, (k,_)) <- xs] (mp1, stepId) <- case Intern.lookup stepKey mp1 of Just stepId -> return (mp1, stepId) Nothing -> do (mp1, stepId) <- return $ Intern.add stepKey mp1 return (mp1, stepId) intern <- newIORef mp1 step <- do v <- Ids.lookup status stepId return $ case v of Just (_, Loaded r) -> incStep $ fromStepResult r _ -> Step 1 journal stepId stepKey $ toStepResult step lock <- newLock act Database{..} putDatabase :: (Key -> Builder) -> ((Key, Status) -> Builder) putDatabase putKey (key, Loaded (Result x1 x2 x3 x4 x5 x6)) = putExN (putKey key) <> putExN (putEx x1) <> putEx x2 <> putEx x3 <> putEx x5 <> putExN (putEx x4) <> putEx x6 putDatabase _ (_, x) = errorInternal $ "putWith, Cannot write Status with constructor " ++ statusType x getDatabase :: (BS.ByteString -> Key) -> BS.ByteString -> (Key, Status) getDatabase getKey bs | (key, bs) <- getExN bs , (x1, bs) <- getExN bs , (x2, x3, x5, bs) <- binarySplit3 bs , (x4, x6) <- getExN bs = (getKey key, Loaded (Result x1 x2 x3 (getEx x4) x5 (getEx x6))) instance BinaryEx Depends where putEx (Depends xs) = putExStorableList xs getEx = Depends . getExStorableList instance BinaryEx [Depends] where putEx = putExList . map putEx getEx = map getEx . 
getExList instance BinaryEx Trace where putEx (Trace a b c) = putEx b <> putEx c <> putEx a getEx x | (b,c,a) <- binarySplit2 x = Trace a b c instance BinaryEx [Trace] where putEx = putExList . map putEx getEx = map getEx . getExList shake-0.16.4/src/Development/Shake/Internal/Core/Action.hs0000644000000000000000000002364513261223301021407 0ustar0000000000000000{-# LANGUAGE RecordWildCards, NamedFieldPuns, ScopedTypeVariables, ConstraintKinds #-} module Development.Shake.Internal.Core.Action( runAction, actionOnException, actionFinally, getShakeOptions, getProgress, runAfter, trackUse, trackChange, trackAllow, trackCheckUsed, getVerbosity, putWhen, putLoud, putNormal, putQuiet, withVerbosity, quietly, blockApply, unsafeAllowApply, traced ) where import Control.Exception import Control.Applicative import Control.Monad.Extra import Control.Monad.IO.Class import Control.DeepSeq import Data.Typeable.Extra import Data.Function import Data.Either.Extra import Data.Maybe import Data.IORef import Data.List import System.IO.Extra import Development.Shake.Internal.Core.Database import Development.Shake.Internal.Core.Monad import Development.Shake.Internal.Core.Types import Development.Shake.Internal.Value import Development.Shake.Internal.Options import Development.Shake.Internal.Errors import General.Cleanup import Prelude --------------------------------------------------------------------- -- RAW WRAPPERS runAction :: Global -> Local -> Action a -> Capture (Either SomeException a) runAction g l (Action x) = runRAW g l x -- | Apply a modification, run an action, then run an undo action after. -- Doesn't actually require exception handling because we don't have the ability to catch exceptions to the user. 
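-- An illustrative sketch: 'withVerbosity' further down is the canonical use.
-- You pass back the modified state plus a function that restores the saved field:
--   actionBracket $ \s0 -> (s0{localVerbosity=Quiet}, \s -> s{localVerbosity=localVerbosity s0})
-- which runs the wrapped Action with 'localVerbosity' set to 'Quiet', then
-- puts the original value back via modifyRW.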
actionBracket :: (Local -> (Local, Local -> Local)) -> Action a -> Action a actionBracket f m = Action $ do s <- getRW let (s2,undo) = f s putRW s2 res <- fromAction m modifyRW undo return res --------------------------------------------------------------------- -- EXCEPTION HANDLING actionBoom :: Bool -> Action a -> IO b -> Action a actionBoom runOnSuccess act clean = do Global{..} <- Action getRO undo <- liftIO $ addCleanup globalCleanup $ void clean -- important to mask_ the undo/clean combo so either both happen or neither res <- Action $ catchRAW (fromAction act) $ \e -> liftIO (mask_ $ undo >> clean) >> throwRAW e liftIO $ mask_ $ undo >> when runOnSuccess (void clean) return res -- | If an exception is raised by the 'Action', perform some 'IO'. actionOnException :: Action a -> IO b -> Action a actionOnException = actionBoom False -- | After an 'Action', perform some 'IO', even if there is an exception. actionFinally :: Action a -> IO b -> Action a actionFinally = actionBoom True --------------------------------------------------------------------- -- QUERIES -- | Get the initial 'ShakeOptions', these will not change during the build process. getShakeOptions :: Action ShakeOptions getShakeOptions = Action $ globalOptions <$> getRO -- | Get the current 'Progress' structure, as would be returned by 'shakeProgress'. getProgress :: Action Progress getProgress = do Global{..} <- Action getRO liftIO globalProgress -- | Specify an action to be run after the database has been closed, if building completes successfully. 
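-- Usage sketch, using only the public Shake API (the directory and pattern
-- are assumptions):
--   runAfter $ removeFiles "output" ["//*.tmp"]
-- schedules the deletion to happen only after the build has completed
-- successfully and the database has been closed.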
runAfter :: IO () -> Action () runAfter op = do Global{..} <- Action getRO liftIO $ atomicModifyIORef globalAfter $ \ops -> (op:ops, ()) --------------------------------------------------------------------- -- VERBOSITY putWhen :: Verbosity -> String -> Action () putWhen v msg = do Global{..} <- Action getRO verb <- getVerbosity when (verb >= v) $ liftIO $ globalOutput v msg -- | Write an unimportant message to the output, only shown when 'shakeVerbosity' is higher than normal ('Loud' or above). -- The output will not be interleaved with any other Shake messages (other than those generated by system commands). putLoud :: String -> Action () putLoud = putWhen Loud -- | Write a normal priority message to the output, only suppressed when 'shakeVerbosity' is 'Quiet' or 'Silent'. -- The output will not be interleaved with any other Shake messages (other than those generated by system commands). putNormal :: String -> Action () putNormal = putWhen Normal -- | Write an important message to the output, only suppressed when 'shakeVerbosity' is 'Silent'. -- The output will not be interleaved with any other Shake messages (other than those generated by system commands). putQuiet :: String -> Action () putQuiet = putWhen Quiet -- | Get the current verbosity level, originally set by 'shakeVerbosity'. If you -- want to output information to the console, you are recommended to use -- 'putLoud' \/ 'putNormal' \/ 'putQuiet', which ensures multiple messages are -- not interleaved. The verbosity can be modified locally by 'withVerbosity'. getVerbosity :: Action Verbosity getVerbosity = Action $ localVerbosity <$> getRW -- | Run an action with a particular verbosity level. -- Will not update the 'shakeVerbosity' returned by 'getShakeOptions' and will -- not have any impact on 'Diagnostic' tracing.
withVerbosity :: Verbosity -> Action a -> Action a withVerbosity new = actionBracket $ \s0 -> (s0{localVerbosity=new}, \s -> s{localVerbosity=localVerbosity s0}) -- | Run an action with 'Quiet' verbosity, in particular messages produced by 'traced' -- (including from 'Development.Shake.cmd' or 'Development.Shake.command') will not be printed to the screen. -- Will not update the 'shakeVerbosity' returned by 'getShakeOptions' and will -- not turn off any 'Diagnostic' tracing. quietly :: Action a -> Action a quietly = withVerbosity Quiet --------------------------------------------------------------------- -- BLOCK APPLY unsafeAllowApply :: Action a -> Action a unsafeAllowApply = applyBlockedBy Nothing blockApply :: String -> Action a -> Action a blockApply = applyBlockedBy . Just applyBlockedBy :: Maybe String -> Action a -> Action a applyBlockedBy reason = actionBracket $ \s0 -> (s0{localBlockApply=reason}, \s -> s{localBlockApply=localBlockApply s0}) --------------------------------------------------------------------- -- TRACING -- | Write an action to the trace list, along with the start/end time of running the IO action. -- The 'Development.Shake.cmd' and 'Development.Shake.command' functions automatically call 'traced'. -- The trace list is used for profile reports (see 'shakeReport'). -- -- By default 'traced' prints some useful extra context about what -- Shake is building, e.g.: -- -- > # traced message (for myobject.o) -- -- To suppress the output of 'traced' (for example you want more control -- over the message using 'putNormal'), use the 'quietly' combinator. 
traced :: String -> IO a -> Action a traced msg act = do Global{..} <- Action getRO Local{localStack} <- Action getRW start <- liftIO globalTimestamp putNormal $ "# " ++ msg ++ " (for " ++ showTopStack localStack ++ ")" res <- liftIO act stop <- liftIO globalTimestamp let trace = newTrace msg start stop liftIO $ evaluate $ rnf trace Action $ modifyRW $ \s -> s{localTraces = trace : localTraces s} return res --------------------------------------------------------------------- -- TRACKING -- | Track that a key has been used by the action preceding it. trackUse :: ShakeValue key => key -> Action () -- One of the following must be true: -- 1) you are the one building this key (e.g. key == topStack) -- 2) you have already been used by apply, and are on the dependency list -- 3) someone explicitly gave you permission with trackAllow -- 4) at the end of the rule, a) you are now on the dependency list, and b) this key itself has no dependencies (is a source file) trackUse key = do let k = newKey key Global{..} <- Action getRO l@Local{..} <- Action getRW deps <- liftIO $ concatMapM (listDepends globalDatabase) localDepends let top = topStack localStack if top == Just k then return () -- condition 1 else if k `elem` deps then return () -- condition 2 else if any ($ k) localTrackAllows then return () -- condition 3 else Action $ putRW l{localTrackUsed = k : localTrackUsed} -- condition 4 trackCheckUsed :: Action () trackCheckUsed = do Global{..} <- Action getRO Local{..} <- Action getRW liftIO $ do deps <- concatMapM (listDepends globalDatabase) localDepends -- check 3a bad <- return $ localTrackUsed \\ deps unless (null bad) $ do let n = length bad errorStructured ("Lint checking error - " ++ (if n == 1 then "value was" else show n ++ " values were") ++ " used but not depended upon") [("Used", Just $ show x) | x <- bad] "" -- check 3b bad <- flip filterM localTrackUsed $ \k -> not .
null <$> lookupDependencies globalDatabase k unless (null bad) $ do let n = length bad errorStructured ("Lint checking error - " ++ (if n == 1 then "value was" else show n ++ " values were") ++ " depended upon after being used") [("Used", Just $ show x) | x <- bad] "" -- | Track that a key has been changed by the action preceding it. trackChange :: ShakeValue key => key -> Action () -- One of the following must be true: -- 1) you are the one building this key (e.g. key == topStack) -- 2) someone explicitly gave you permission with trackAllow -- 3) this file is never known to the build system, at the end it is not in the database trackChange key = do let k = newKey key Global{..} <- Action getRO Local{..} <- Action getRW liftIO $ do let top = topStack localStack if top == Just k then return () -- condition 1 else if any ($ k) localTrackAllows then return () -- condition 2 else -- condition 3 atomicModifyIORef globalTrackAbsent $ \ks -> ((fromMaybe k top, k):ks, ()) -- | Allow any matching key to violate the tracking rules. trackAllow :: ShakeValue key => (key -> Bool) -> Action () trackAllow (test :: key -> Bool) = Action $ modifyRW $ \s -> s{localTrackAllows = f : localTrackAllows s} where tk = typeRep (Proxy :: Proxy key) f k = typeKey k == tk && test (fromKey k) shake-0.16.4/src/Development/Ninja/0000755000000000000000000000000013261223301015164 5ustar0000000000000000shake-0.16.4/src/Development/Ninja/Type.hs0000644000000000000000000000350013261223301016439 0ustar0000000000000000{-# LANGUAGE TupleSections #-} -- | The IO in this module is only to evaluate an environment variable, -- the 'Env' itself is passed around purely.
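-- A worked sketch (the bindings are assumptions): with an Env where "cc" is
-- bound to "gcc",
--   askExpr env (Exprs [Lit (BS.pack "invoke "), Var (BS.pack "cc")])
-- evaluates to "invoke gcc"; a Var that is missing from the Env evaluates to
-- the empty string, per 'askVar'.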
module Development.Ninja.Type( Str, FileStr, Expr(..), Env, newEnv, askVar, askExpr, addEnv, addBind, addBinds, Ninja(..), newNinja, Build(..), Rule(..), ) where import Control.Applicative import Development.Ninja.Env import qualified Data.ByteString.Char8 as BS import Data.Maybe import Prelude type Str = BS.ByteString type FileStr = Str --------------------------------------------------------------------- -- EXPRESSIONS AND BINDINGS data Expr = Exprs [Expr] | Lit Str | Var Str deriving (Show,Eq) askExpr :: Env Str Str -> Expr -> IO Str askExpr e = f where f (Exprs xs) = BS.concat <$> mapM f xs f (Lit x) = return x f (Var x) = askVar e x askVar :: Env Str Str -> Str -> IO Str askVar e x = fromMaybe BS.empty <$> askEnv e x addBind :: Env Str Str -> Str -> Expr -> IO () addBind e k v = addEnv e k =<< askExpr e v addBinds :: Env Str Str -> [(Str, Expr)] -> IO () addBinds e bs = do bs <- mapM (\(a,b) -> (a,) <$> askExpr e b) bs mapM_ (uncurry $ addEnv e) bs --------------------------------------------------------------------- -- STRUCTURE data Ninja = Ninja {sources :: [FilePath] ,rules :: [(Str,Rule)] ,singles :: [(FileStr,Build)] ,multiples :: [([FileStr], Build)] ,phonys :: [(Str, [FileStr])] ,defaults :: [FileStr] ,pools :: [(Str, Int)] } deriving Show newNinja :: Ninja newNinja = Ninja [] [] [] [] [] [] [] data Build = Build {ruleName :: Str ,env :: Env Str Str ,depsNormal :: [FileStr] ,depsImplicit :: [FileStr] ,depsOrderOnly :: [FileStr] ,buildBind :: [(Str,Str)] } deriving Show newtype Rule = Rule {ruleBind :: [(Str,Expr)] } deriving Show shake-0.16.4/src/Development/Ninja/Parse.hs0000644000000000000000000000545513261223301016603 0ustar0000000000000000{-# LANGUAGE RecordWildCards, TupleSections #-} module Development.Ninja.Parse(parse) where import qualified Data.ByteString.Char8 as BS import Development.Ninja.Env import Development.Ninja.Type import Development.Ninja.Lexer import Control.Applicative import Control.Monad import Prelude parse :: FilePath -> Env 
Str Str -> IO Ninja parse file env = parseFile file env newNinja parseFile :: FilePath -> Env Str Str -> Ninja -> IO Ninja parseFile file env ninja = do lexes <- lexerFile $ if file == "-" then Nothing else Just file foldM (applyStmt env) ninja{sources=file:sources ninja} $ withBinds lexes withBinds :: [Lexeme] -> [(Lexeme, [(Str,Expr)])] withBinds [] = [] withBinds (x:xs) = (x,a) : withBinds b where (a,b) = f xs f (LexBind a b : rest) = let (as,bs) = f rest in ((a,b):as, bs) f xs = ([], xs) applyStmt :: Env Str Str -> Ninja -> (Lexeme, [(Str,Expr)]) -> IO Ninja applyStmt env ninja@Ninja{..} (key, binds) = case key of LexBuild outputs rule deps -> do outputs <- mapM (askExpr env) outputs deps <- mapM (askExpr env) deps binds <- mapM (\(a,b) -> (a,) <$> askExpr env b) binds let (normal,implicit,orderOnly) = splitDeps deps let build = Build rule env normal implicit orderOnly binds return $ if rule == BS.pack "phony" then ninja{phonys = [(x, normal ++ implicit ++ orderOnly) | x <- outputs] ++ phonys} else if length outputs == 1 then ninja{singles = (head outputs, build) : singles} else ninja{multiples = (outputs, build) : multiples} LexRule name -> return ninja{rules = (name, Rule binds) : rules} LexDefault xs -> do xs <- mapM (askExpr env) xs return ninja{defaults = xs ++ defaults} LexPool name -> do depth <- getDepth env binds return ninja{pools = (name, depth) : pools} LexInclude expr -> do file <- askExpr env expr parseFile (BS.unpack file) env ninja LexSubninja expr -> do file <- askExpr env expr e <- scopeEnv env parseFile (BS.unpack file) e ninja LexDefine a b -> do addBind env a b return ninja LexBind a _ -> error $ "Unexpected binding defining " ++ BS.unpack a splitDeps :: [Str] -> ([Str], [Str], [Str]) splitDeps (x:xs) | x == BS.pack "|" = ([],a++b,c) | x == BS.pack "||" = ([],b,a++c) | otherwise = (x:a,b,c) where (a,b,c) = splitDeps xs splitDeps [] = ([], [], []) getDepth :: Env Str Str -> [(Str, Expr)] -> IO Int getDepth env xs = case lookup (BS.pack 
"depth") xs of Nothing -> return 1 Just x -> do x <- askExpr env x case BS.readInt x of Just (i, n) | BS.null n -> return i _ -> error $ "Could not parse depth field in pool, got: " ++ BS.unpack x shake-0.16.4/src/Development/Ninja/Lexer.hs0000644000000000000000000001761513261223301016611 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} -- {-# OPTIONS_GHC -O2 #-} -- fails with GHC 7.10 -- {-# OPTIONS_GHC -ddump-simpl #-} -- | Lexing is a slow point, the code below is optimised module Development.Ninja.Lexer(Lexeme(..), lexerFile) where import Control.Applicative import Data.Tuple.Extra import Data.Char import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Unsafe as BS import Development.Ninja.Type import qualified Data.ByteString.Internal as Internal import System.IO.Unsafe import Data.Word import Foreign.Ptr import Foreign.Storable import GHC.Exts import Prelude --------------------------------------------------------------------- -- LIBRARY BITS newtype Str0 = Str0 Str -- null terminated type S = Ptr Word8 char :: S -> Char char x = Internal.w2c $ unsafePerformIO $ peek x next :: S -> S next x = x `plusPtr` 1 {-# INLINE dropWhile0 #-} dropWhile0 :: (Char -> Bool) -> Str0 -> Str0 dropWhile0 f x = snd $ span0 f x {-# INLINE span0 #-} span0 :: (Char -> Bool) -> Str0 -> (Str, Str0) span0 f = break0 (not . f) {-# INLINE break0 #-} break0 :: (Char -> Bool) -> Str0 -> (Str, Str0) break0 f (Str0 bs) = (BS.unsafeTake i bs, Str0 $ BS.unsafeDrop i bs) where i = unsafePerformIO $ BS.unsafeUseAsCString bs $ \ptr -> do let start = castPtr ptr :: S let end = go start return $! 
Ptr end `minusPtr` start go s@(Ptr a) | c == '\0' || f c = a | otherwise = go (next s) where c = char s {-# INLINE break00 #-} -- The predicate must return true for '\0' break00 :: (Char -> Bool) -> Str0 -> (Str, Str0) break00 f (Str0 bs) = (BS.unsafeTake i bs, Str0 $ BS.unsafeDrop i bs) where i = unsafePerformIO $ BS.unsafeUseAsCString bs $ \ptr -> do let start = castPtr ptr :: S let end = go start return $! Ptr end `minusPtr` start go s@(Ptr a) | f c = a | otherwise = go (next s) where c = char s head0 :: Str0 -> Char head0 (Str0 x) = Internal.w2c $ BS.unsafeHead x tail0 :: Str0 -> Str0 tail0 (Str0 x) = Str0 $ BS.unsafeTail x list0 :: Str0 -> (Char, Str0) list0 x = (head0 x, tail0 x) take0 :: Int -> Str0 -> Str take0 i (Str0 x) = BS.takeWhile (/= '\0') $ BS.take i x --------------------------------------------------------------------- -- ACTUAL LEXER -- Lex each line separately, rather than each lexeme data Lexeme = LexBind Str Expr -- [indent]foo = bar | LexBuild [Expr] Str [Expr] -- build foo: bar | baz || qux (| and || are represented as Expr) | LexInclude Expr -- include file | LexSubninja Expr -- include file | LexRule Str -- rule name | LexPool Str -- pool name | LexDefault [Expr] -- default foo bar | LexDefine Str Expr -- foo = bar deriving Show isVar, isVarDot :: Char -> Bool isVar x = x == '-' || x == '_' || isAsciiLower x || isAsciiUpper x || isDigit x isVarDot x = x == '.' 
|| isVar x

endsDollar :: Str -> Bool
endsDollar = BS.isSuffixOf (BS.singleton '$')

dropN :: Str0 -> Str0
dropN x = if head0 x == '\n' then tail0 x else x

dropSpace :: Str0 -> Str0
dropSpace = dropWhile0 (== ' ')


lexerFile :: Maybe FilePath -> IO [Lexeme]
lexerFile file = lexer <$> maybe BS.getContents BS.readFile file

lexer :: Str -> [Lexeme]
lexer x = lexerLoop $ Str0 $ x `BS.append` BS.pack "\n\n\0"

lexerLoop :: Str0 -> [Lexeme]
lexerLoop c_x | (c,x) <- list0 c_x = case c of
    '\r' -> lexerLoop x
    '\n' -> lexerLoop x
    ' ' -> lexBind $ dropSpace x
    '#' -> lexerLoop $ dropWhile0 (/= '\n') x
    'b' | Just x <- strip "uild " x -> lexBuild x
    'r' | Just x <- strip "ule " x -> lexRule x
    'd' | Just x <- strip "efault " x -> lexDefault x
    'p' | Just x <- strip "ool " x -> lexPool x
    'i' | Just x <- strip "nclude " x -> lexInclude x
    's' | Just x <- strip "ubninja " x -> lexSubninja x
    '\0' -> []
    _ -> lexDefine c_x
    where
        strip str (Str0 x) = if b `BS.isPrefixOf` x then Just $ dropSpace $ Str0 $ BS.drop (BS.length b) x else Nothing
            where b = BS.pack str

lexBind :: Str0 -> [Lexeme]
lexBind c_x | (c,x) <- list0 c_x = case c of
    '\r' -> lexerLoop x
    '\n' -> lexerLoop x
    '#' -> lexerLoop $ dropWhile0 (/= '\n') x
    '\0' -> []
    _ -> lexxBind LexBind c_x

lexBuild :: Str0 -> [Lexeme]
lexBuild x
    | (outputs,x) <- lexxExprs True x
    , (rule,x) <- span0 isVarDot $ jumpCont $ dropSpace x
    , (deps,x) <- lexxExprs False $ dropSpace x
    = LexBuild outputs rule deps : lexerLoop x

lexDefault :: Str0 -> [Lexeme]
lexDefault x
    | (files,x) <- lexxExprs False x
    = LexDefault files : lexerLoop x

lexRule, lexPool, lexInclude, lexSubninja, lexDefine :: Str0 -> [Lexeme]
lexRule = lexxName LexRule
lexPool = lexxName LexPool
lexInclude = lexxFile LexInclude
lexSubninja = lexxFile LexSubninja
lexDefine = lexxBind LexDefine

lexxBind :: (Str -> Expr -> Lexeme) -> Str0 -> [Lexeme]
lexxBind ctor x
    | (var,x) <- span0 isVarDot x
    , ('=',x) <- list0 $ jumpCont $ dropSpace x
    , (exp,x) <- lexxExpr False False $ jumpCont $ dropSpace x
    = ctor var exp : lexerLoop x
lexxBind _ x = error $ show ("parse failed when parsing binding", take0 100 x)

lexxFile :: (Expr -> Lexeme) -> Str0 -> [Lexeme]
lexxFile ctor x
    | (exp,rest) <- lexxExpr False False $ dropSpace x
    = ctor exp : lexerLoop rest

lexxName :: (Str -> Lexeme) -> Str0 -> [Lexeme]
lexxName ctor x
    | (name,rest) <- splitLineCont x
    = ctor name : lexerLoop rest

lexxExprs :: Bool -> Str0 -> ([Expr], Str0)
lexxExprs stopColon x = case lexxExpr stopColon True x of
    (a,c_x) | c <- head0 c_x, x <- tail0 c_x -> case c of
        ' ' -> add a $ lexxExprs stopColon $ dropSpace x
        ':' | stopColon -> new a x
        _ | stopColon -> error "expected a colon"
        '\r' -> new a $ dropN x
        '\n' -> new a x
        '\0' -> new a c_x
    where
        new a x = add a ([], x)
        add (Exprs []) x = x
        add a (as,x) = (a:as,x)

{-# NOINLINE lexxExpr #-}
lexxExpr :: Bool -> Bool -> Str0 -> (Expr, Str0) -- snd will start with one of " :\n\r" or be empty
lexxExpr stopColon stopSpace = first exprs . f
    where
        exprs [x] = x
        exprs xs = Exprs xs

        special = case (stopColon, stopSpace) of
            (True , True ) -> \x -> x <= ':' && (x == ':' || x == ' ' || x == '$' || x == '\r' || x == '\n' || x == '\0')
            (True , False) -> \x -> x <= ':' && (x == ':' || x == '$' || x == '\r' || x == '\n' || x == '\0')
            (False, True ) -> \x -> x <= '$' && (x == ' ' || x == '$' || x == '\r' || x == '\n' || x == '\0')
            (False, False) -> \x -> x <= '$' && (x == '$' || x == '\r' || x == '\n' || x == '\0')

        f x = case break00 special x of
            (a,x) -> if BS.null a then g x else Lit a $: g x

        x $: (xs,y) = (x:xs,y)

        g x | head0 x /= '$' = ([], x)
        g x | c_x <- tail0 x, (c,x) <- list0 c_x = case c of
            '$' -> Lit (BS.singleton '$') $: f x
            ' ' -> Lit (BS.singleton ' ') $: f x
            ':' -> Lit (BS.singleton ':') $: f x
            '\n' -> f $ dropSpace x
            '\r' -> f $ dropSpace $ dropN x
            '{' | (name,x) <- span0 isVarDot x, not $ BS.null name, ('}',x) <- list0 x -> Var name $: f x
            _ | (name,x) <- span0 isVar c_x, not $ BS.null name -> Var name $: f x
            _ -> error "Unexpected $ followed by unexpected stuff"

jumpCont :: Str0 -> Str0
jumpCont o
    | '$' <- head0 o
    , let x = tail0 o
    = case head0 x of
        '\n' -> dropSpace $ tail0 x
        '\r' -> dropSpace $ dropN $ tail0 x
        _ -> o
    | otherwise = o

splitLineCont :: Str0 -> (Str, Str0)
splitLineCont x = first BS.concat $ f x
    where f x = if not $ endsDollar a then ([a], b) else let (c,d) = f $ dropSpace b in (BS.init a : c, d)
            where (a,b) = splitLineCR x

splitLineCR :: Str0 -> (Str, Str0)
splitLineCR x = if BS.singleton '\r' `BS.isSuffixOf` a then (BS.init a, dropN b) else (a, dropN b)
    where (a,b) = break0 (== '\n') x

shake-0.16.4/src/Development/Ninja/Env.hs

{-# LANGUAGE PatternGuards #-}

-- | A Ninja style environment, basically a linked-list of mutable hash tables.
module Development.Ninja.Env(
    Env, newEnv, scopeEnv, addEnv, askEnv, fromEnv
    ) where

import qualified Data.HashMap.Strict as Map
import Data.Hashable
import Data.IORef

data Env k v = Env (IORef (Map.HashMap k v)) (Maybe (Env k v))
instance Show (Env k v) where show _ = "Env"

newEnv :: IO (Env k v)
newEnv = do ref <- newIORef Map.empty; return $ Env ref Nothing

scopeEnv :: Env k v -> IO (Env k v)
scopeEnv e = do ref <- newIORef Map.empty; return $ Env ref $ Just e

addEnv :: (Eq k, Hashable k) => Env k v -> k -> v -> IO ()
addEnv (Env ref _) k v = modifyIORef ref $ Map.insert k v

askEnv :: (Eq k, Hashable k) => Env k v -> k -> IO (Maybe v)
askEnv (Env ref e) k = do
    mp <- readIORef ref
    case Map.lookup k mp of
        Just v -> return $ Just v
        Nothing | Just e <- e -> askEnv e k
        _ -> return Nothing

fromEnv :: Env k v -> IO (Map.HashMap k v)
fromEnv (Env ref _) = readIORef ref

shake-0.16.4/src/Development/Ninja/All.hs

{-# LANGUAGE RecordWildCards, ViewPatterns, ScopedTypeVariables #-}

module Development.Ninja.All(runNinja) where

import Development.Ninja.Env
import Development.Ninja.Type
import Development.Ninja.Parse
import Development.Shake
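The `Development.Ninja.Env` module above implements Ninja variable scoping as a chain of mutable maps: a lookup consults the innermost scope first and falls back to the parent chain. A minimal standalone sketch of how `newEnv`/`scopeEnv`/`addEnv`/`askEnv` interact (using `Data.Map` instead of `Data.HashMap.Strict` so it needs no extra packages; the variable names are made up for illustration):

```haskell
import Data.IORef
import qualified Data.Map as Map

-- A scoped environment: a mutable map plus an optional parent scope,
-- mirroring Development.Ninja.Env.
data Env k v = Env (IORef (Map.Map k v)) (Maybe (Env k v))

newEnv :: IO (Env k v)
newEnv = do ref <- newIORef Map.empty; return $ Env ref Nothing

scopeEnv :: Env k v -> IO (Env k v)
scopeEnv e = do ref <- newIORef Map.empty; return $ Env ref $ Just e

addEnv :: Ord k => Env k v -> k -> v -> IO ()
addEnv (Env ref _) k v = modifyIORef ref $ Map.insert k v

-- Lookup walks outward through the scope chain until a binding is found.
askEnv :: Ord k => Env k v -> k -> IO (Maybe v)
askEnv (Env ref parent) k = do
    mp <- readIORef ref
    case Map.lookup k mp of
        Just v -> return $ Just v
        Nothing -> maybe (return Nothing) (`askEnv` k) parent

main :: IO ()
main = do
    root <- newEnv
    addEnv root "cflags" "-O2"
    child <- scopeEnv root            -- e.g. a per-build-statement scope
    addEnv child "cflags" "-O0"       -- shadows the parent binding
    print =<< askEnv child "cflags"   -- Just "-O0"
    print =<< askEnv root "cflags"    -- Just "-O2"
    print =<< askEnv child "ldflags"  -- no binding in either scope: Nothing
```

Note that, as in the real module, bindings added to a parent after `scopeEnv` are still visible from the child, because the child holds a reference to the live parent rather than a snapshot.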
hiding (addEnv) import qualified Data.ByteString as BS8 import qualified Data.ByteString.Char8 as BS import System.Directory import qualified Data.HashMap.Strict as Map import qualified Data.HashSet as Set import Data.Tuple.Extra import Control.Applicative import Control.Exception.Extra import Control.Monad import Data.Maybe import Data.Char import Data.List.Extra import System.Info.Extra import Prelude -- Internal imports import General.Extra(removeFile_) import General.Timing(addTiming) import General.Makefile(parseMakefile) import Development.Shake.Internal.FileName(filepathNormalise, fileNameFromString) import Development.Shake.Internal.FileInfo(getFileInfo) import Development.Shake.Internal.Errors(errorStructured) import Development.Shake.Internal.Rules.File(needBS, neededBS) import Development.Shake.Internal.Rules.OrderOnly(orderOnlyBS) -- | Given the Ninja source file, a list of file arguments, a tool name. -- Return a bool if you should restart and the rules. runNinja :: IO () -> FilePath -> [String] -> Maybe String -> IO (Maybe (Rules ())) runNinja restart file args (Just "compdb") = do dir <- getCurrentDirectory Ninja{..} <- parse file =<< newEnv rules <- return $ Map.fromList [r | r <- rules, BS.unpack (fst r) `elem` args] -- the build items are generated in reverse order, hence the reverse let xs = [(a,b,file,rule) | (a,b@Build{..}) <- reverse $ multiples ++ map (first return) singles , Just rule <- [Map.lookup ruleName rules], file:_ <- [depsNormal]] xs <- forM xs $ \(out,Build{..},file,Rule{..}) -> do -- the order of adding new environment variables matters env <- scopeEnv env addEnv env (BS.pack "out") (BS.unwords $ map quote out) addEnv env (BS.pack "in") (BS.unwords $ map quote depsNormal) addEnv env (BS.pack "in_newline") (BS.unlines depsNormal) forM_ buildBind $ \(a,b) -> addEnv env a b addBinds env ruleBind commandline <- fmap BS.unpack $ askVar env $ BS.pack "command" return $ CompDb dir commandline $ BS.unpack $ head depsNormal putStr $ 
printCompDb xs return Nothing runNinja restart file args (Just x) = errorIO $ "Unknown tool argument, expected 'compdb', got " ++ x runNinja restart file args tool = do addTiming "Ninja parse" ninja@Ninja{..} <- parse file =<< newEnv return $ Just $ do phonys <- return $ Map.fromList phonys needDeps <- return $ needDeps ninja phonys -- partial application singles <- return $ Map.fromList $ map (first filepathNormalise) singles multiples <- return $ Map.fromList [(x,(xs,b)) | (xs,b) <- map (first $ map filepathNormalise) multiples, x <- xs] rules <- return $ Map.fromList rules pools <- fmap Map.fromList $ forM ((BS.pack "console",1):pools) $ \(name,depth) -> (,) name <$> newResource (BS.unpack name) depth action $ do -- build the .ninja files, if they change, restart the build before <- liftIO $ mapM (getFileInfo . fileNameFromString) sources need sources after <- liftIO $ mapM (getFileInfo . fileNameFromString) sources if before /= after then runAfter restart else needBS $ concatMap (resolvePhony phonys) $ if not $ null args then map BS.pack args else if not $ null defaults then defaults else Map.keys singles ++ Map.keys multiples (\x -> map BS.unpack . fst <$> Map.lookup (BS.pack x) multiples) &?> \out -> let out2 = map BS.pack out in build needDeps phonys rules pools out2 $ snd $ multiples Map.! head out2 (flip Map.member singles . BS.pack) ?> \out -> let out2 = BS.pack out in build needDeps phonys rules pools [out2] $ singles Map.! out2 resolvePhony :: Map.HashMap Str [Str] -> Str -> [Str] resolvePhony mp = f $ Left 100 where f (Left 0) x = f (Right []) x f (Right xs) x | x `elem` xs = error $ "Recursive phony involving " ++ BS.unpack x f a x = case Map.lookup x mp of Nothing -> [x] Just xs -> concatMap (f $ either (Left . subtract 1) (Right . 
(x:)) a) xs quote :: Str -> Str quote x | BS.any isSpace x = let q = BS.singleton '\"' in BS.concat [q,x,q] | otherwise = x build :: (Build -> [Str] -> Action ()) -> Map.HashMap Str [Str] -> Map.HashMap Str Rule -> Map.HashMap Str Resource -> [Str] -> Build -> Action () build needDeps phonys rules pools out build@Build{..} = do needBS $ concatMap (resolvePhony phonys) $ depsNormal ++ depsImplicit orderOnlyBS $ concatMap (resolvePhony phonys) depsOrderOnly case Map.lookup ruleName rules of Nothing -> liftIO $ errorIO $ "Ninja rule named " ++ BS.unpack ruleName ++ " is missing, required to build " ++ BS.unpack (BS.unwords out) Just Rule{..} -> do env <- liftIO $ scopeEnv env liftIO $ do -- the order of adding new environment variables matters addEnv env (BS.pack "out") (BS.unwords $ map quote out) addEnv env (BS.pack "in") (BS.unwords $ map quote depsNormal) addEnv env (BS.pack "in_newline") (BS.unlines depsNormal) forM_ buildBind $ \(a,b) -> addEnv env a b addBinds env ruleBind applyRspfile env $ do commandline <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "command" depfile <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "depfile" deps <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "deps" description <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "description" pool <- liftIO $ askVar env $ BS.pack "pool" let withPool act = case Map.lookup pool pools of _ | BS.null pool -> act Nothing -> liftIO $ errorIO $ "Ninja pool named " ++ BS.unpack pool ++ " not found, required to build " ++ BS.unpack (BS.unwords out) Just r -> withResource r 1 act when (description /= "") $ putNormal description let (cmdOpts, cmdProg, cmdArgs) = toCommand commandline if deps == "msvc" then do Stdout stdout <- withPool $ command cmdOpts cmdProg cmdArgs prefix <- liftIO $ fmap (fromMaybe $ BS.pack "Note: including file: ") $ askEnv env $ BS.pack "msvc_deps_prefix" needDeps build $ parseShowIncludes prefix $ BS.pack stdout else withPool $ command_ cmdOpts cmdProg cmdArgs when 
(depfile /= "") $ do when (deps /= "gcc") $ need [depfile] depsrc <- liftIO $ BS.readFile depfile needDeps build $ concatMap snd $ parseMakefile depsrc -- correct as per the Ninja spec, but breaks --skip-commands -- when (deps == "gcc") $ liftIO $ removeFile depfile needDeps :: Ninja -> Map.HashMap Str [Str] -> Build -> [Str] -> Action () needDeps Ninja{..} phonysMp = \build xs -> do -- eta reduced so 'builds' is shared opts <- getShakeOptions if isNothing $ shakeLint opts then needBS xs else do neededBS xs -- now try and statically validate needed will never fail -- first find which dependencies are generated files xs <- return $ filter (`Map.member` builds) xs -- now try and find them as dependencies let bad = xs `difference` allDependencies build case bad of [] -> return () xs -> liftIO $ errorStructured ("Lint checking error - " ++ (if length xs == 1 then "file in deps is" else "files in deps are") ++ " generated and not a pre-dependency") [("File", Just $ BS.unpack x) | x <- xs] "" where builds :: Map.HashMap FileStr Build builds = Map.fromList $ singles ++ [(x,y) | (xs,y) <- multiples, x <- xs] -- do list difference, assuming a small initial set, most of which occurs early in the list difference :: [Str] -> [Str] -> [Str] difference [] ys = [] difference xs ys = f (Set.fromList xs) ys where f xs [] = Set.toList xs f xs (y:ys) | y `Set.member` xs = if Set.null xs2 then [] else f xs2 ys where xs2 = Set.delete y xs f xs (y:ys) = f xs ys -- find all dependencies of a rule, no duplicates, with all dependencies of this rule listed first allDependencies :: Build -> [FileStr] allDependencies rule = f Set.empty [] [rule] where f seen [] [] = [] f seen [] (x:xs) = f seen (map filepathNormalise $ concatMap (resolvePhony phonysMp) $ depsNormal x ++ depsImplicit x ++ depsOrderOnly x) xs f seen (x:xs) rest | x `Set.member` seen = f seen xs rest | otherwise = x : f (Set.insert x seen) xs (maybeToList (Map.lookup x builds) ++ rest) applyRspfile :: Env Str Str -> Action a -> 
Action a applyRspfile env act = do rspfile <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "rspfile" rspfile_content <- liftIO $ askVar env $ BS.pack "rspfile_content" if rspfile == "" then act else flip actionFinally (removeFile_ rspfile) $ do liftIO $ BS.writeFile rspfile rspfile_content act parseShowIncludes :: Str -> Str -> [FileStr] parseShowIncludes prefix out = [y | x <- BS.lines out, prefix `BS.isPrefixOf` x , let y = BS.dropWhile isSpace $ BS.drop (BS.length prefix) x , not $ isSystemInclude y] -- Dodgy, but ported over from the original Ninja isSystemInclude :: FileStr -> Bool isSystemInclude x = bsProgFiles `BS.isInfixOf` tx || bsVisStudio `BS.isInfixOf` tx where tx = BS8.map (\c -> if c >= 97 then c - 32 else c) x -- optimised toUpper that only cares about letters and spaces bsProgFiles = BS.pack "PROGRAM FILES" bsVisStudio = BS.pack "MICROSOFT VISUAL STUDIO" data CompDb = CompDb {cdbDirectory :: String ,cdbCommand :: String ,cdbFile :: String } deriving Show printCompDb :: [CompDb] -> String printCompDb xs = unlines $ ["["] ++ concat (zipWith f [1..] xs) ++ ["]"] where n = length xs f i CompDb{..} = [" {" ," \"directory\": " ++ g cdbDirectory ++ "," ," \"command\": " ++ g cdbCommand ++ "," ," \"file\": " ++ g cdbFile ," }" ++ (if i == n then "" else ",")] g = show toCommand :: String -> ([CmdOption], String, [String]) toCommand s -- On POSIX, Ninja does a /bin/sh -c, and so does Haskell in Shell mode (easy). | not isWindows = ([Shell], s, []) -- On Windows, Ninja passes the string directly to CreateProcess, -- but Haskell applies some escaping first. -- We try and get back as close to the original as we can, but it's very hacky | length s < 8000 = -- Using the "cmd" program adds overhead (I measure 7ms), and a limit of 8191 characters, -- but is the most robust, requiring no additional escaping. 
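The `parseShowIncludes` function above extracts header dependencies from MSVC's `/showIncludes` output: it keeps lines starting with the (localizable) prefix, strips the prefix and leading spaces, and discards system headers via `isSystemInclude`. A plain-`String` sketch of the same logic, runnable without the bytestring package (the sample compiler output is invented for illustration):

```haskell
import Data.Char (isSpace)
import Data.List (isInfixOf, stripPrefix)

-- Keep the lines MSVC's /showIncludes prints, strip the prefix,
-- and drop headers that live in system locations.
parseShowIncludes :: String -> String -> [FilePath]
parseShowIncludes prefix out =
    [ y | x <- lines out, Just rest <- [stripPrefix prefix x]
        , let y = dropWhile isSpace rest
        , not $ isSystemInclude y ]

-- Dodgy, but mirrors the heuristic ported from the original Ninja.
isSystemInclude :: FilePath -> Bool
isSystemInclude x = "PROGRAM FILES" `isInfixOf` tx || "MICROSOFT VISUAL STUDIO" `isInfixOf` tx
    where
        tx = map upperAscii x
        upperAscii c = if c >= 'a' && c <= 'z' then toEnum (fromEnum c - 32) else c

main :: IO ()
main = mapM_ putStrLn $ parseShowIncludes "Note: including file: " sample
    where sample = unlines
            [ "Note: including file: src\\util.h"
            , "Note: including file:    C:\\Program Files\\MSVC\\include\\stdio.h"
            , "main.c" ]
-- prints only: src\util.h
```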
        ([Shell], s, [])
    | (cmd,s) <- word1 s, map toUpper cmd `elem` ["CMD","CMD.EXE"], ("/c",s) <- word1 s =
        -- Given "cmd.exe /c " we translate to Shell, which adds cmd.exe
        -- (looked up on the current path) and /c to the front. CMake uses this rule a lot.
        -- Adding quotes around pieces after /c goes very wrong.
        ([Shell], s, [])
    | otherwise =
        -- It's a long command line which doesn't call "cmd /c". We reverse the escaping
        -- Haskell applies, but each argument will still gain quotes around it.
        let xs = splitArgs s in ([], head $ xs ++ [""], drop 1 xs)

data State
    = Gap  -- ^ Currently in the gap between words
    | Word -- ^ Currently inside a space-separated argument
    | Quot -- ^ Currently inside a quote-surrounded argument

-- | The process package contains a translate function, reproduced below. The aim is that after command line
--   parsing we should get out mostly the same answer.
splitArgs :: String -> [String]
splitArgs = f Gap
    where
        f Gap (x:xs) | isSpace x = f Gap xs
        f Gap ('\"':xs) = f Quot xs
        f Gap [] = []
        f Gap xs = f Word xs
        f Word (x:xs) | isSpace x = [] : f Gap xs
        f Quot ('\"':xs) = [] : f Gap xs
        f s ('\\':xs) | (length -> a, b) <- span (== '\\') xs = case b of
            '\"':xs | even a -> add (replicate (a `div` 2) '\\' ++ "\"") $ f s xs
                    | otherwise -> add (replicate ((a+1) `div` 2) '\\') $ f s ('\"':xs)
            xs -> add (replicate (a+1) '\\') $ f s xs
        f s (x:xs) = add [x] $ f s xs
        f s [] = [[]]

        add a (b:c) = (a++b):c
        add a [] = [a]

{-
translate (cmd,args) = unwords $ f cmd : map f args
    where
        f x = '"' : snd (foldr escape (True,"\"") xs)
        escape '"' (_, str) = (True, '\\' : '"' : str)
        escape '\\' (True, str) = (True, '\\' : '\\' : str)
        escape c (_, str) = (False, c : str)
-}

shake-0.16.4/html/shake.js

// GENERATED CODE - DO NOT MODIFY
// SOURCE IS IN THE ts/ DIRECTORY
"use strict";
var Summary = (function () { function
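The `splitArgs` state machine above undoes the escaping that Windows `CreateProcess` parsing applies: spaces separate arguments unless quoted, `2n` backslashes before a quote collapse to `n` backslashes plus a quote toggle, and `2n+1` backslashes before a quote yield `n` backslashes plus a literal quote. A standalone copy of the function with a small demonstration (the sample command lines are invented):

```haskell
{-# LANGUAGE ViewPatterns #-}
import Data.Char (isSpace)

data State = Gap | Word | Quot

-- splitArgs restated standalone: split a Windows command line the way
-- CreateProcess-style parsing (and the process package) would.
splitArgs :: String -> [String]
splitArgs = f Gap
    where
        f Gap (x:xs) | isSpace x = f Gap xs
        f Gap ('\"':xs) = f Quot xs
        f Gap [] = []
        f Gap xs = f Word xs
        f Word (x:xs) | isSpace x = [] : f Gap xs
        f Quot ('\"':xs) = [] : f Gap xs
        f s ('\\':xs) | (length -> a, b) <- span (== '\\') xs = case b of
            '\"':xs | even a -> add (replicate (a `div` 2) '\\' ++ "\"") $ f s xs
                    | otherwise -> add (replicate ((a+1) `div` 2) '\\') $ f s ('\"':xs)
            xs -> add (replicate (a+1) '\\') $ f s xs
        f s (x:xs) = add [x] $ f s xs
        f s [] = [[]]

        add a (b:c) = (a++b):c
        add a [] = [a]

main :: IO ()
main = do
    print $ splitArgs "gcc -c \"my file.c\""  -- quoted arg keeps its space
    print $ splitArgs "a\\\"b"                 -- backslash-escaped literal quote
```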
Summary() { this.count = 0; this.countLast = 0; this.highestRun = 0; this.sumExecution = 0; this.maxExecution = 0; this.maxExecutionName = ""; this.countTrace = 0; this.countTraceLast = 0; this.sumTrace = 0; this.sumTraceLast = 0; this.maxTrace = 0; this.maxTraceName = ""; this.maxTraceStopLast = 0; } return Summary; })(); function summary(dat) { var res = new Summary(); res.count = dat.length; for (var _i = 0; _i < dat.length; _i++) { var e = dat[_i]; var isLast = e.built === 0; res.countLast += isLast ? 1 : 0; res.sumExecution += e.execution; res.maxExecution = Math.max(res.maxExecution, e.execution); if (res.maxExecution === e.execution) res.maxExecutionName = e.name; res.highestRun = Math.max(res.highestRun, e.changed); var traces = e.traces; if (!traces) continue; for (var _a = 0; _a < traces.length; _a++) { var t_1 = traces[_a]; var time = t_1.stop - t_1.start; res.countTrace += 1; res.countTraceLast += isLast ? 1 : 0; res.sumTrace += time; res.sumTraceLast += isLast ? time : 0; res.maxTrace = Math.max(res.maxTrace, time); if (res.maxTrace == time) res.maxTraceName = t_1.command; res.maxTraceStopLast = Math.max(res.maxTraceStopLast, isLast ? t_1.stop : 0); } } return res; } function showSummary(sum) { return ["This database has tracked " + (sum.highestRun + 1) + " run" + plural(sum.highestRun + 1) + ".", "There are " + sum.count + " rules (" + sum.countLast + " rebuilt in the last run).", "Building required " + sum.countTrace + " traced commands (" + sum.countTraceLast + " in the last run).", "The total (unparallelised) build time is " + showTime(sum.sumExecution) + " of which " + showTime(sum.sumTrace) + " is traced commands.", "The longest rule takes " + showTime(sum.maxExecution) + " (" + sum.maxExecutionName + ") and the longest traced command takes " + showTime(sum.maxTrace) + " (" + sum.maxTraceName + ").", "Last run gave an average parallelism of " + (sum.maxTraceStopLast === 0 ? 
0 : sum.sumTraceLast / sum.maxTraceStopLast).toFixed(2) + " times over " + showTime(sum.maxTraceStopLast) + "." ]; } var Prepare = (function () { function Prepare() { } return Prepare; })(); function addRdeps(dat) { var rdeps = []; for (var i = 0; i < dat.length; i++) rdeps[i] = {}; for (var i = 0; i < dat.length; i++) { for (var _i = 0, _a = dat[i].depends; _i < _a.length; _i++) { var j = _a[_i]; rdeps[j][i] = null; } } var res = dat; for (var i = 0; i < rdeps.length; i++) { var ans = []; for (var j in rdeps[i]) ans.push(Number(j)); res[i].rdeps = ans; } return res; } function calcRebuildCosts(dat, xs) { var seen = {}; var tot = 0; function f(i) { if (i in seen) return; seen[i] = null; tot += dat[i].execution; for (var _i = 0, _a = dat[i].rdeps; _i < _a.length; _i++) { var j = _a[_i]; f(j); } } if (xs.length === 1 && dat[xs[0]].depends.length === 1) tot = dat[dat[xs[0]].depends[0]].cost + dat[xs[0]].execution; else { for (var _i = 0; _i < xs.length; _i++) { var x = xs[_i]; f(x); } } return tot; } function addCost(dat) { var res = dat; for (var i = 0; i < dat.length; i++) { res[i].cost = calcRebuildCosts(res, [i]); } return res; } function prepare(dat_) { var sum = summary(dat_); var dat = addCost(addRdeps(dat_)); function toHash(r) { return typeof r === "string" ? "$" + r : "/" + r.source; } function findDirect(key) { var c = cache(toHash, function (r) { var want = {}; for (var _i = 0; _i < dat.length; _i++) { var e = dat[_i]; if (testRegExp(r, e.name)) { var deps = (e)[key]; for (var _a = 0; _a < deps.length; _a++) { var j = deps[_a]; want[j] = null; } } } return want; }); return function (i, r) { return i in c(r); }; } function findTransitive(key, dirFwd) { var c = cache(toHash, function (r) { var want = {}; for (var i = 0; i < dat.length; i++) { var j = dirFwd ? 
i : dat.length - 1 - i; if ((j in want) || testRegExp(r, dat[j].name)) { want[j] = null; var deps = (dat[j])[key]; for (var _i = 0; _i < deps.length; _i++) { var k = deps[_i]; want[k] = null; } } } return want; }); return function (i, r) { return i in c(r); }; } return { original: dat, summary: sum, dependsOnThis: findDirect("rdeps"), thisDependsOn: findDirect("depends"), dependsOnThisTransitive: findTransitive("depends", false), thisDependsOnTransitive: findTransitive("rdeps", true) }; } function colorAnd(c1, c2) { return c1 === null ? c2 : c1 === c2 ? c1 : undefined; } var Result = (function () { function Result() { } return Result; })(); function ruleFilter(dat, query) { queryData = dat; var f = readQuery(query); var res = {}; for (queryKey = 0; queryKey < dat.original.length; queryKey++) { queryVal = dat.original[queryKey]; queryName = queryVal.name; queryGroup = null; queryBackColor = null; queryTextColor = null; if (f()) { if (queryGroup === null) queryGroup = queryName; if (!(queryGroup in res)) res[queryGroup] = { items: [queryKey], text: queryTextColor, back: queryBackColor }; else { var c = res[queryGroup]; c.items.push(queryKey); c.text = colorAnd(c.text, queryTextColor); c.back = colorAnd(c.back, queryBackColor); } } } return res; } var ResultTable = (function () { function ResultTable() { } return ResultTable; })(); function ruleTable(dat, query) { function bools(x, y) { return x === "" ? y : x === y ? 
x : "both"; } var res = ruleFilter(dat, query); var ans = []; for (var s in res) { var xs = res[s].items; var time = 0; var leaf = ""; var unchanged = ""; var run = 100000; for (var i = 0; i < xs.length; i++) { var x = dat.original[xs[i]]; time += x.execution; leaf = bools(leaf, x.depends.length === 0); unchanged = bools(unchanged, x.changed !== x.built); run = Math.min(run, x.built); } ans.push({ name: s, count: xs.length, time: time, back: res[s].back, text: res[s].text, cost: calcRebuildCosts(dat.original, xs), leaf: leaf, run: run, unchanged: unchanged }); } return ans; } var ResultGraph = (function () { function ResultGraph() { } return ResultGraph; })(); function ruleGraph(dat, query) { var res = ruleFilter(dat, query); var map = {}; var direct = {}; var ind = -1; for (var s in res) { ind++; var xs = res[s].items; for (var i = 0; i < xs.length; i++) direct[xs[i]] = ind; } function getDirect(key) { return key in direct ? [direct[key]] : []; } var indirect = {}; function getIndirect(key) { if (key in indirect) return indirect[key]; if (key in direct) return []; var ds = dat.original[key].depends; var res = []; for (var j = 0; j < ds.length; j++) { res.push(getIndirect(ds[j])); res.push(getDirect(ds[j])); } var res2 = concatNub(res); indirect[key] = res2; return res2; } var ans = []; for (var s in res) { var xs = res[s].items; var ds = []; var is = []; for (var i = 0; i < xs.length; i++) { var depends = dat.original[xs[i]].depends; for (var j = 0; j < depends.length; j++) { ds.push(getDirect(depends[j])); is.push(getIndirect(depends[j])); } } ans.push({ name: s, text: res[s].text, back: res[s].back, parents: concatNub(ds), ancestors: concatNub(is) }); } return ans; } function commandFilter(last, dat, query) { queryData = dat; var f = readQuery(query); var res = {}; for (queryKey = 0; queryKey < dat.original.length; queryKey++) { queryVal = dat.original[queryKey]; if (last && queryVal.built !== 0) continue; var val = recordCopy(queryVal); var ts = queryVal.traces 
|| []; queryVal = val; queryName = queryVal.name; queryBackColor = null; queryTextColor = null; for (var i = 0; i < ts.length; i++) { queryVal.traces = [ts[i]]; queryGroup = null; if (f()) { if (queryGroup === null) queryGroup = ts[i].command; if (!(queryGroup in res)) res[queryGroup] = { items: [ts[i]], text: queryTextColor, back: queryBackColor }; else { var c = res[queryGroup]; c.items.push(ts[i]); c.text = colorAnd(c.text, queryTextColor); c.back = colorAnd(c.back, queryBackColor); } } } } return res; } var CommandTable = (function () { function CommandTable() { } return CommandTable; })(); function commandTable(dat, query) { var res = commandFilter(false, dat, query); var ans = []; for (var s in res) { var xs = res[s].items; var time = 0; for (var _i = 0; _i < xs.length; _i++) { var t_2 = xs[_i]; time += t_2.stop - t_2.start; } ans.push({ name: s, count: xs.length, text: res[s].text, back: res[s].back, time: time }); } return ans; } function commandPlot(dat, query, buckets) { var end = dat.summary.maxTraceStopLast; var res = commandFilter(true, dat, query); var ans = {}; for (var s in res) { var ts = res[s].items; var xs = []; for (var i = 0; i <= buckets; i++) xs.push(0); for (var _i = 0; _i < ts.length; _i++) { var t_3 = ts[_i]; var start = t_3.start * buckets / end; var stop = t_3.stop * buckets / end; if (Math.floor(start) === Math.floor(stop)) xs[Math.floor(start)] += stop - start; else { for (var j = Math.ceil(start); j < Math.floor(stop); j++) xs[j]++; xs[Math.floor(start)] += Math.ceil(start) - start; xs[Math.floor(stop)] += stop - Math.floor(stop); } } ans[s] = { items: xs.slice(0, buckets), back: res[s].back || null }; } return ans; } function readQuery(query) { if (query === "") return function () { return true; }; var f; try { f = (new Function("return " + query)); } catch (e) { throw { user: true, name: "parse", query: query, message: e.toString() }; } return function () { try { return f(); } catch (e) { throw { user: true, name: "execution", 
query: query, message: e.toString() }; } }; } var queryData = {}; var queryKey = 0; var queryVal = {}; var queryName = ""; var queryGroup = null; var queryBackColor = null; var queryTextColor = null; function childOf(r) { return queryData.dependsOnThis(queryKey, r); } function parentOf(r) { return queryData.thisDependsOn(queryKey, r); } function ancestorOf(r) { return queryData.dependsOnThisTransitive(queryKey, r); } function descendantOf(r) { return queryData.thisDependsOnTransitive(queryKey, r); } function descendentOf(r) { return descendantOf(r); } function group(x) { if (queryGroup === null) queryGroup = ""; queryGroup += (queryGroup === "" ? "" : " ") + x; return true; } function backColor(c, b) { if (b === void 0) { b = true; } if (b) queryBackColor = c; return true; } function textColor(c, b) { if (b === void 0) { b = true; } if (b === undefined || b) queryTextColor = c; return true; } function rename(from, to) { if (to === void 0) { to = ""; } queryName = queryName.replace(from, to); return true; } function slowestRule() { return queryData.summary.maxExecutionName; } function leaf() { return queryVal.depends.length === 0; } function run(i) { if (i === undefined) return queryVal.built; else return queryVal.built === i; } function unchanged() { return queryVal.changed !== queryVal.built; } function named(r, groupName) { if (r === undefined) return queryName; var res = execRegExp(r, queryName); if (res === null) { if (groupName === undefined) return false; else { group(groupName); return true; } } if (res.length !== 1) { for (var i = 1; i < res.length; i++) group(res[i]); } return true; } function command(r, groupName) { var n = (queryVal.traces || []).length; if (r === undefined) return n === 0 ? 
"" : queryVal.traces[0].command; for (var _i = 0, _a = queryVal.traces; _i < _a.length; _i++) { var t_4 = _a[_i]; var res = execRegExp(r, t_4.command); if (res === null) continue; if (res.length !== 1) { for (var j = 1; j < res.length; j++) group(res[j]); } return true; } if (groupName === undefined) return false; else { group(groupName); return true; } } "use strict"; function initProgress() { $(function () { $(".version").html("Generated by Shake " + version + "."); $("#output").html(""); for (var _i = 0; _i < progress.length; _i++) { var x = progress[_i]; var actual = []; var ideal = []; for (var t = 5; t < x.values.length; t++) { var y = x.values[t]; actual.push([y.idealSecs, y.actualSecs]); ideal.push([y.idealSecs, y.idealSecs]); } var ys = [{ data: ideal, color: "gray" }, { label: x.name, data: actual, color: "red" }]; var div = $("
"); $("#output").append(div); $.plot(div, ys, { xaxis: { transform: function (v) { return -v; }, inverseTransform: function (v) { return -v; } } }); } }); } "use strict"; function t(a, b, c) { return { start: a, stop: b, command: c }; } var dat1 = [{ name: "Functional", built: 0, changed: 3, depends: [], execution: 1, traces: [t(0, 1, "gen")] }, { name: "Imperative", built: 0, changed: 0, depends: [], execution: 2, traces: [t(0, 1, "gen"), t(1, 2, "gen")] }, { name: "HsSource", built: 3, changed: 3, depends: [], execution: 0 }, { name: "Haskell", built: 3, changed: 3, depends: [0, 2], execution: 8, traces: [t(1, 8.9, "ghc")] }, { name: "C", built: 0, changed: 0, depends: [1], execution: 15, traces: [t(2, 16.9, "gcc")] }, { name: "Cpp", built: 0, changed: 0, depends: [1], execution: 10, traces: [t(2, 10, "gcc")] }, { name: "Exe", built: 0, changed: 0, depends: [3, 4, 5], execution: 5, traces: [t(17, 22, "link")] } ]; function test() { function assert(b) { if (!b) throw "Assertion failed"; } function assertEq(got, want) { if (want != got) { console.log("Wanted: " + want); console.log("Got: " + got); assert(false); } } function assertRegex(want, got) { if (!want.test(got)) { console.log("Wanted: " + want); console.log("Got: " + got); assert(false); } } var tab1 = prepare(dat1); var ssum1 = showSummary(tab1.summary); console.log(ssum1); var want = ["4 runs", "7 rules", "5 rebuilt", "7 traced", "6 in", "build time is 41.00s", "38.80s is traced", "longest rule takes 15.00s", "longest traced command takes 14.90s", "parallelism of 1.40", "22.00s"]; assertRegex(new RegExp(want.join(".*")), ssum1.join(" ")); var par1 = commandPlot(tab1, "group('x')", 10)['x']; console.log(par1); var pars1 = par1.items.map(function (i) { return Math.round(i * 10) / 10; }); assert(listEq(pars1, [1.5, 2, 2, 2, 1.5, 1, 1, 1, 1, 1])); function chk(f, query, n) { var ans = f(tab1, query); console_table(ans); assertEq(ans.length, n); } chk(ruleTable, "", 7); chk(ruleTable, "leaf()", 3); 
chk(ruleTable, "named(/^(.)/)", 5); chk(commandTable, "", 4); chk(commandTable, "command(/g(.*)/)", 3); chk(ruleTable, "childOf('Imperative')", 2); return "passed"; } function console_table(xs) { if ("table" in console) console["table"](xs); else if (xs.length === 0) console.log("No data"); else { var widths = []; var cells = []; for (var i_1 = 0; i_1 <= xs.length; i_1++) cells.push([]); for (var s_1 in xs[0]) { var len = s_1.length; cells[0].push(s_1); for (var i = 0; i < xs.length; i++) { var ss = "" + xs[i][s_1]; len = Math.max(len, ss.length); cells[i + 1].push(ss); } widths.push(len); } var s = ""; for (var x = 0; x < cells.length; x++) { for (var y = 0; y < widths.length; y++) s += "|" + pad(widths[y], cells[x][y]); s += "|\n"; } console.log(s); } } function pad(n, s) { var res = s; for (var i = s.length; i < n; i++) res += " "; return res; } "use strict"; var prepared = prepare(profile); var currentTable = null; var Report = (function () { function Report(mode_, query_) { if (mode_ === void 0) { mode_ = "summary"; } if (query_ === void 0) { query_ = ""; } this.sort = "time"; this.sortRev = false; this.mode = mode_; this.query = query_; } return Report; })(); var report = new Report(null, null); function reportEq(r1, r2) { return r1.mode === r2.mode && r1.query === r2.query && r1.sort === r2.sort && r1.sortRev === r2.sortRev; } function reportToURL(r) { var def = new Report(); return "?mode=" + r.mode + (r.query === def.query ? "" : "&query=" + encodeURI(r.query).replace(/\+/g, "%2B")) + ((!r.sortRev && r.sort === def.sort) ? "" : "&sort=" + (r.sortRev ? "!" : "") + r.sort); } function reportFromURL(s) { if (s === void 0) { s = window.location.search; } var res = new Report(); var params = uriQueryParameters(s); if ("mode" in params) res.mode = params["mode"]; if ("query" in params) res.query = params["query"]; if ("sort" in params) { var sort = params["sort"]; res.sortRev = sort.substr(0, 1) == "!"; res.sort = sort.substr(res.sortRev ? 
1 : 0); } return res; } function reportFromUser() { return new Report($("#mode").val(), $("#query").val()); } function setReport(set, replace, run) { var report2 = set(recordCopy(report)); $("#mode").val(report2.mode); $("#query").val(report2.query); $("#run").enable(false).attr("title", "The current query is displayed"); if (reportEq(report, report2)) return; report = report2; if (window.history) { var title = report.mode + (report.query === "" ? "" : ": " + report.query); var url = reportToURL(report); try { if (replace) window.history.replaceState(report, title, url); else window.history.pushState(report, title, url); } catch (e) { } } $("#link").attr("href", reportToURL(report)); if (run) runReport(); } var rightAlign = { count: null, time: null, cost: null, run: null, leaf: null, unchanged: null }; var twoColumns = { cost: null, time: null }; var defaultRevSort = { run: null, name: null }; function tableSort(x) { if (report.sort === x) setReport(function (r) { r.sortRev = !r.sortRev; return r; }, true, false); else setReport(function (r) { r.sort = x; r.sortRev = x in defaultRevSort; return r; }, true, false); showTable(currentTable); } function showTable(xs) { currentTable = xs; if (xs.length === 0) { $("#output").html("No data found"); return; } if (!(report.sort in xs[0])) setReport(function (r) { return new Report(r.mode, r.query); }, true, false); xs.sort(function (a, b) { return (report.sortRev ? -1 : 1) * (b[report.sort] > a[report.sort] ? 1 : -1); }); var res = ""; for (var s in xs[0]) { if (s === "back" || s === "text") continue; res += s in twoColumns ? ""; } res += ""; for (var _i = 0; _i < xs.length; _i++) { var x = xs[_i]; res += ""; if (s === "count") res += x[s] + " ×"; else if (s === "time" || s === "cost") res += showTime(x[s]) + ""; } res += ""; } res += "
" + s; if (s === report.sort) res += " " + (report.sortRev ? "▲" : "▼") + ""; res += "
" + showPerc(x[s] / prepared.summary.sumExecution); else res += x[s]; res += "
"; $("#output").html(res); } var currentPlot = null; function showPlot(series, options) { var $output = $("#output"); var width = $output.width(); var height = $output.height(); if (series === null && options === null) { if (width === currentPlot.width && height === currentPlot.height) return; series = currentPlot.series; options = currentPlot.options; } currentPlot = { series: series, options: options, width: width, height: height }; var div = $("
").width(width - 20).height(height - 10); $("#output").html("").append(div); $.plot(div, series, options); } window.onresize = function () { if (currentPlot !== null) showPlot(null, null); }; function runReport() { currentTable = null; currentPlot = null; try { switch (report.mode) { case "summary": var res = showSummary(prepared.summary); var s_2 = $("#welcome").html(); s_2 += "
    "; for (var i = 0; i < res.length; i++) s_2 += "
  • " + res[i] + "
  • "; s_2 += "
"; s_2 += "

Generated by Shake " + version + ".

"; $("#output").html(s_2); break; case "cmd-plot": { var xs = commandPlot(prepared, report.query, 100); var ys = []; for (var s_3 in xs) { var x = xs[s_3].items; var data = []; for (var j = 0; j < x.length; j++) data.push([j, x[j]]); ys.push({ label: s_3, data: data, color: xs[s_3].back, avg: sum(x) / x.length }); } if (ys.length === 0) { $("#output").html("No data found, " + (prepared.summary.countTraceLast === 0 ? "there were no traced commands in the last run." : "perhaps your filter is too restrictive?")); } else { ys.sort(function (a, b) { return a.avg - b.avg; }); showPlot(ys, { legend: { show: true, position: "nw", sorted: "reverse" }, series: { stack: true, lines: { lineWidth: 0, fill: 1 } }, yaxis: { min: 0 }, xaxis: { tickFormatter: function (i) { return showTime(prepared.summary.maxTraceStopLast * i / 100); } } }); } } break; case "cmd-table": showTable(commandTable(prepared, report.query)); break; case "rule-table": showTable(ruleTable(prepared, report.query)); break; case "rule-graph": { var xs = ruleGraph(prepared, report.query); if (xs.length > 250) $("#output").html("Viewing a graph with > 250 nodes is not supported, and you have " + xs.length + " nodes. 
Try grouping more aggressively"); else if (typeof Viz === 'undefined') $("#output").html("Profile reports do not seem to have been built with GraphViz support, this feature is unavailable."); else { var res_1 = "digraph \"\"{"; res_1 += "graph[nodesep=0.15,ranksep=0.3];"; res_1 += "node[fontname=\"sans-serif\",fontsize=9,penwidth=0.5,height=0,width=0];"; res_1 += "edge[penwidth=0.5,arrowsize=0.5];"; for (var i = 0; i < xs.length; i++) { res_1 += "a" + i + "[label=\"" + xs[i].name.split("\\").join("\\\\").split("\"").join("\\\"") + "\""; if (xs[i].back) res_1 += ",style=filled,color=\"" + xs[i].back + "\""; if (xs[i].text) res_1 += ",fontcolor=\"" + xs[i].text + "\""; res_1 += "];"; var parents = xs[i].parents; for (var j = 0; j < parents.length; j++) res_1 += "a" + i + "->a" + parents[j] + ";"; var ancestors = xs[i].ancestors; for (var j = 0; j < ancestors.length; j++) res_1 += "a" + i + "->a" + ancestors[j] + "[style=dashed];"; } res_1 += "}"; $("#output").html(Viz(res_1, "svg")); } } break; case "help": $("#output").html($("#help").html()); break; default: throw "Unknown report type: " + report.mode; break; } } catch (e) { if (!(e && e.user)) throw e; $("#output").html($("#error").html()); for (var s in e) $("#output ." + s).text(e[s]); } } function example(mode, query) { setReport(function (_) { return new Report(mode, query); }, false, true); return false; } function initProfile() { $(function () { setReport(function (_) { return reportFromURL(); }, true, true); $("#mode,#query").bind("input change", function () { var mode = $("#mode").val(); var query = $("#query").val(); var enable = mode !== report.mode || query !== report.query; $("#run").enable(enable).attr("title", enable ? 
"" : "The current query is displayed"); $("#link").attr("href", reportToURL(reportFromUser())); }); $("#run").click(function () { setReport(function (_) { return reportFromUser(); }, false, true); }); $("#query").keypress(function (e) { if (e.which == 13) $("#run").click(); }); window.onpopstate = function (e) { setReport(function (_) { return reportFromUser(); }, true, true); }; $("a.example").each(function () { var mode = $(this).attr("data-mode"); var query = $(this).attr("data-query"); if (query === undefined) query = $(this).text(); var href = reportToURL(new Report(mode, query)); var onclick = "return example(decodeURI('" + encodeURI(mode) + "'),decodeURI('" + encodeURI(query) + "'));"; $(this).attr("href", href).attr("target", "_blank")[0].setAttribute("onclick", onclick); }); $("a.shake").each(function () { var href = "https://hackage.haskell.org/packages/archive/shake/latest/doc/html/Development-Shake.html#v:" + $(this).text().replace("'", "-39-"); $(this).attr("href", href).attr("target", "_blank"); }); }); } "use strict"; jQuery.fn.enable = function (x) { return this.each(function () { if (x) $(this).removeAttr('disabled'); else $(this).attr('disabled', 'disabled'); }); }; function uriQueryParameters(s) { var params = {}; var a = /\+/g; var r = /([^&=]+)=?([^&]*)/g; var d = function (s) { return decodeURIComponent(s.replace(a, " ")); }; var q = s.substring(1); while (true) { var e = r.exec(q); if (!e) break; params[d(e[1])] = d(e[2]); } return params; } ; function showTime(x) { function digits(x) { var s = String(x); return s.length === 1 ? 
"0" + s : s; } if (x >= 3600) { x = Math.round(x / 60); return Math.floor(x / 60) + "h" + digits(x % 60) + "m"; } else if (x >= 60) { x = Math.round(x); return Math.floor(x / 60) + "m" + digits(x % 60) + "s"; } else return x.toFixed(2) + "s"; } function showPerc(x) { return (x * 100).toFixed(2) + "%"; } function plural(n, not1, is1) { if (not1 === void 0) { not1 = "s"; } if (is1 === void 0) { is1 = ""; } return n === 1 ? is1 : not1; } function sum(xs) { var res = 0; for (var _i = 0; _i < xs.length; _i++) { var x = xs[_i]; res += x; } return res; } function testRegExp(r, s) { if (typeof r === "string") return s.indexOf(r) !== -1; else return r.test(s); } function execRegExp(r, s) { if (typeof r === "string") return s.indexOf(r) === -1 ? null : []; else return r.exec(s); } function listEq(xs, ys) { if (xs.length !== ys.length) return false; for (var i = 0; i < xs.length; i++) { if (xs[i] !== ys[i]) return false; } return true; } function cache(key, op) { var store = {}; return function (k) { var s = key(k); if (!(s in store)) store[s] = op(k); return store[s]; }; } function mapEq(xs, ys) { function f(a, b) { for (var s in a) { if (a[s] !== b[s]) return false; } return true; } return f(xs, ys) && f(ys, xs); } function recordCopy(xs) { return mapCopy(xs); } function mapCopy(xs) { var res = {}; for (var s in xs) res[s] = xs[s]; return res; } function mapUnion(xs, ys) { var res = mapCopy(ys); for (var s in xs) res[s] = xs[s]; return res; } function concatNub(xss) { var res = []; var seen = {}; for (var _i = 0; _i < xss.length; _i++) { var xs = xss[_i]; for (var _a = 0; _a < xs.length; _a++) { var x = xs[_a]; var v = x; if (!(v in seen)) { seen[v] = null; res.push(x); } } } return res; } shake-0.16.4/html/progress.html0000644000000000000000000000214713261223301014556 0ustar0000000000000000 Shake progress diagnostic

Shake Progress Analysis

The red line represents the real performance, with time to completion on the X axis and predicted time on the Y axis. A perfect progress prediction would follow the gray line.

Loading...

shake-0.16.4/html/profile.html0000644000000000000000000003243513261223301014355 0ustar0000000000000000 Shake report
Link

Loading...
shake-0.16.4/docs/0000755000000000000000000000000013261223302012005 5ustar0000000000000000shake-0.16.4/docs/shake-progress.png0000644000000000000000000002302013261223302015445 0ustar0000000000000000shake-0.16.4/docs/Manual.md

    import Development.Shake
    import Development.Shake.Command
    import Development.Shake.FilePath
    import Development.Shake.Util

    main :: IO ()
    main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
        want ["_build/run" <.> exe]

        phony "clean" $ do
            putNormal "Cleaning files in _build"
            removeFilesAfter "_build" ["//*"]

        "_build/run" <.> exe %> \out -> do
            cs <- getDirectoryFiles "" ["//*.c"]
            let os = ["_build" </> c -<.> "o" | c <- cs]
            need os
            cmd_ "gcc -o" [out] os

        "_build//*.o" %> \out -> do
            let c = dropDirectory1 $ out -<.> "c"
            let m = out -<.> "m"
            cmd_ "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
            needMakefileDependencies m

This build system builds the executable `_build/run` from all C source files in the current directory. It will rebuild if you add/remove any C files to the directory, if the C files themselves change, or if any headers used by the C files change. All generated files are placed in `_build`, and a `clean` command is provided that will wipe all the generated files.

In the rest of this manual we'll explain how the above code works and how to extend it.

#### Running this example

To run the example above:

1. Install the [Haskell Stack](https://haskellstack.org/), which provides a Haskell compiler and package manager.
2. Type `stack install shake`, to build and install Shake and all its dependencies.
3. Type `stack exec -- shake --demo`, which will create a directory containing a sample project, the above Shake script (named `Build.hs`), and execute it (which can be done by `runhaskell Build.hs`).

For more details see a [trace of `shake --demo`](Demo.md).

## Basic syntax

This section explains enough syntax to write a basic Shake build script.
#### Boilerplate

The build system above starts with the following boilerplate:
    import Development.Shake
    import Development.Shake.Command
    import Development.Shake.FilePath
    import Development.Shake.Util

    main :: IO ()
    main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
        build rules
All the interesting build-specific code is placed under build rules. Many build systems will be able to reuse that boilerplate unmodified.

#### Defining targets

A target is a file we want the build system to produce (typically executable files). For example, if we want to produce the file `manual/examples.txt` we can write:

    want ["manual/examples.txt"]

The `want` function takes a list of strings. In Shake lists are written `[item1,item2,item3]` and strings are written `"contents of a string"`. Special characters in strings can be escaped using `\` (e.g. `"\n"` for newline) and directory separators are always written `/`, even on Windows.

Most files have the same name on all platforms, but executable files on Windows usually have the `.exe` extension, while on POSIX they have no extension. When writing cross-platform build systems (like the initial example), we can write:

    want ["_build/run" <.> exe]

The `<.>` function adds an extension to a file path, and the built-in `exe` variable evaluates to `"exe"` on Windows and `""` otherwise.

#### Defining rules

A rule describes the steps required to build a file. A rule has two components, a pattern and some actions:
    pattern %> \out -> do
        actions
The pattern is a string saying which files this rule can build. It may be a specific file (e.g. `"manual/examples.txt" %> ...`) or may use wildcards:

* The `*` wildcard matches anything apart from a directory separator. For example `"manual/*.txt"` would define a rule for any `.txt` file in the `manual` directory, including `manual/examples.txt`, but would not match `manual/examples.zip`, `examples.txt` or `manual/docs/examples.txt`.
* The `//` wildcard matches any number of complete path components. For example `//*.txt` would define a rule for any `.txt` file, including `manual/examples.txt`. As another example, `manual//examples.txt` would match any file named `examples.txt` inside `manual`, including both `manual/examples.txt` and `manual/docs/examples.txt`.

It is an error for multiple patterns to match a file being built, so you should keep patterns minimal. Looking at the two rules in the initial example:

    "_build/run" <.> exe %> ...
    "_build//*.o" %> ...

The first matches only the `run` executable, using `<.> exe` to ensure the executable is correctly named on all platforms. The second matches any `.o` file anywhere under `_build`. As examples, `_build/main.o` and `_build/foo/bar.o` both match while `main.o` and `_build/main.txt` do not.

Lots of compilers produce `.o` files, so if you are combining two different languages, say C and Haskell, use the extensions `.c.o` and `.hs.o` to avoid overlapping rules.

The actions are a list of steps to perform and are listed one per line, indented beneath the rule. Actions both express dependencies (say what this rule uses) and run commands (actually generate the file). During the action the `out` variable is bound to the file that is being produced.

#### A simple rule

Let's look at a simple example of a rule:

    "*.rot13" %> \out -> do
        let src = out -<.> "txt"
        need [src]
        cmd_ "rot13" src "-o" out

This rule can build any `.rot13` file.
Imagine we are building `"file.rot13"`. The rule proceeds by:

* Using `let` to define a local variable `src`, using the `-<.>` extension-replacement function, which removes the extension from a file and adds a new extension. When `out` is `"file.rot13"` the variable `src` will become `"file.txt"`.
* Using `need` to introduce a dependency on the `src` file, ensuring that if `src` changes then `out` will be rebuilt, and that `src` will be up-to-date before any further commands are run.
* Using `cmd_` to run the command line `rot13 file.txt -o file.rot13`, which should read `file.txt` and write out `file.rot13`, being the ROT13 encoding of the file.

Many rules follow this pattern -- calculate some local variables, `need` some dependencies, then use `cmd_` to perform some actions. We now discuss each of the three statements.

#### Local variables

Local variables can be defined as:
    let variable = expression
Where variable is a name consisting of letters, numbers and underscores (a-z, A-Z, 0-9 and \_). All variables _must_ start with a lower-case letter. An expression is any combination of variables and function calls, for example `out -<.> "txt"`. A list of some common functions is discussed later.

Variables are evaluated by substituting the expression everywhere the variable is used. In the simple example we could have equivalently written:

    "*.rot13" %> \out -> do
        need [out -<.> "txt"]
        cmd_ "rot13" (out -<.> "txt") "-o" out

Variables are local to the rule they are defined in, cannot be modified, and should not be defined multiple times within a single rule.

#### File dependencies

You can express a dependency on a file with:

    need ["file.src"]

To depend on multiple files you can write:

    need ["file.1","file.2"]

Or alternatively:

    need ["file.1"]
    need ["file.2"]

It is preferable to use fewer calls to `need`, if possible, as multiple files required by a single `need` can be built in parallel.

#### Running external commands

The `cmd_` function allows you to call system commands, e.g. `gcc`. Taking the initial example, we see:

    cmd_ "gcc -o" [out] os

After substituting `out` (a string variable) and `os` (a list-of-strings variable) we might get:

    cmd_ "gcc -o" ["_build/run"] ["_build/main.o","_build/constants.o"]

The `cmd_` function takes any number of space-separated expressions. Each expression can be either a string (which is treated as a space-separated list of arguments) or a list of strings (which is treated as a direct list of arguments). Therefore the above command line is equivalent to either of:

    cmd_ "gcc -o _build/run _build/main.o _build/constants.o"
    cmd_ ["gcc","-o","_build/run","_build/main.o","_build/constants.o"]

To properly handle unknown string variables it is recommended to enclose them in a list, e.g. `[out]`, so that even if `out` contains a space it will be treated as a single argument.
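To see why this matters, suppose `out` happened to be `"my dir/run"` (a hypothetical path containing a space). The two forms below behave differently (a sketch):

```haskell
-- Enclosing the variable in a list keeps it a single argument:
cmd_ "gcc -o" [out] os           -- runs: gcc -o "my dir/run" ...

-- Splicing it into the string splits it at the space:
cmd_ ("gcc -o " ++ out)          -- runs: gcc -o my dir/run   (two arguments, wrong)
```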
The `cmd_` function as presented here will fail if the system command returns a non-zero exit code, but see later for how to treat failing commands differently.

#### Filepath manipulation functions

Shake provides a complete library of filepath manipulation functions (see the [docs for `Development.Shake.FilePath`](https://hackage.haskell.org/package/shake/docs/Development-Shake-FilePath.html)), but the most common are:

* `str1 </> str2` -- add the path components together with a slash, e.g. `"_build" </> "main.o"` equals `"_build/main.o"`.
* `str1 <.> str2` -- add an extension, e.g. `"main" <.> "o"` equals `"main.o"`.
* `str1 ++ str2` -- append two strings together, e.g. `"hello" ++ "world"` equals `"helloworld"`.
* `str1 -<.> str2` -- replace an extension, e.g. `"main.c" -<.> "o"` equals `"main.o"`.
* `dropExtension str` -- drop the final extension of a filepath if it has one, e.g. `dropExtension "main.o"` equals `"main"`, while `dropExtension "main"` equals `"main"`.
* `takeFileName str` -- drop the path component, e.g. `takeFileName "_build/src/main.o"` equals `"main.o"`.
* `dropDirectory1 str` -- drop the first path component, e.g. `dropDirectory1 "_build/src/main.o"` equals `"src/main.o"`.

## Advanced Syntax

The following section covers more advanced operations that are necessary for moderately complex build systems, but not simple ones.

#### Directory listing dependencies

The function `getDirectoryFiles` can retrieve a list of files within a directory:

    files <- getDirectoryFiles "" ["//*.c"]

After this operation `files` will be a variable containing all the files matching the pattern `"//*.c"` (those with the extension `.c`) starting at the directory `""` (the current directory). To obtain all `.c` and `.cpp` files in the src directory we can write:

    files <- getDirectoryFiles "src" ["//*.c","//*.cpp"]

The `getDirectoryFiles` operation is tracked by the build system, so if the files in a directory change the rule will rebuild in the next run.
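As an illustration of the tracked listing, here is a hypothetical rule (the file names are made up) that concatenates every `.txt` file under `docs` into one output; adding or removing a file reruns the rule, and `readFile'`/`writeFile'` are Shake's dependency-tracked file functions:

```haskell
"_build/all-docs.txt" %> \out -> do
    txts <- getDirectoryFiles "docs" ["//*.txt"]   -- tracked directory listing
    -- readFile' adds a dependency on each file it reads
    contents <- mapM (readFile' . ("docs" </>)) txts
    writeFile' out (concat contents)
```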
You should only use `getDirectoryFiles` on source files, not files that are generated by the build system, otherwise the results will change while you are running the build and the build may be inconsistent.

#### List manipulations

Many functions work with lists of values. The simplest operation on lists is to join two lists together, which we do with `++`. For example, `["main.c"] ++ ["constants.c"]` equals `["main.c", "constants.c"]`.

Using a _list comprehension_ we can produce new lists, applying functions to the elements and filtering them. As an example:

    ["_build" </> x -<.> "o" | x <- inputs]

This expression grabs each element from `inputs` and names it `x` (the `x <- inputs`, pronounced "`x` is drawn from `inputs`"), then applies the expression `"_build" </> x -<.> "o"` to each element. If we start with the list `["main.c","constants.c"]`, we would end up with `["_build/main.o", "_build/constants.o"]`.

List expressions also allow us to filter the list. For example, we might know that the file `"evil.c"` is in the directory, but should not be compiled. We can extend the comprehension to:

    ["_build" </> x -<.> "o" | x <- inputs, x /= "evil.c"]

The `/=` operator checks for inequality, and any predicate after the drawn-from is used to first restrict which elements of the list are available.

#### Using `gcc` to collect headers

One common problem when building `.c` files is tracking down which headers they transitively import, and thus must be added as a dependency. We can solve this problem by asking `gcc` to create a file while building that contains a list of all the imports. If we run:

    gcc -c main.c -o main.o -MMD -MF main.m

That will compile `main.c` to `main.o`, and also produce a file `main.m` containing the dependencies. To add these dependencies as dependencies of this rule we can call:

    needMakefileDependencies "main.m"

Now, if either `main.c` or any headers transitively imported by `main.c` change, the file will be rebuilt.
In the initial example the complete rule is:

    "_build//*.o" %> \out -> do
        let c = dropDirectory1 $ out -<.> "c"
        let m = out -<.> "m"
        cmd_ "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
        needMakefileDependencies m

We first compute the source file `c` (e.g. `"main.c"`) that is associated with the `out` file (e.g. `"_build/main.o"`). We then compute a temporary file `m` to write the dependencies to (e.g. `"_build/main.m"`). We then call `gcc` using the `-MMD -MF` flags and finally call `needMakefileDependencies`.

#### Top-level variables

Variables local to a rule are defined using `let`, but you can also define top-level variables. Top-level variables are defined before the `main` call, for example:

    buildDir = "_build"

You can now use `buildDir` in place of `"_build"` throughout. You can also define parametrised variables (functions) by adding argument names:

    buildDir x = "_build" </> x

We can now write:

    buildDir ("run" <.> exe) %> \out -> do
        ...

All top-level variables and functions can be thought of as being expanded wherever they are used, although in practice they may have their evaluation shared.

#### A clean command

A standard clean command is defined as:

    phony "clean" $ do
        putNormal "Cleaning files in _build"
        removeFilesAfter "_build" ["//*"]

Running the build system with the `clean` argument, e.g. `runhaskell Build.hs clean`, will remove all files under the `_build` directory.

This clean command is formed from two separate pieces. Firstly, we can define `phony` commands as:
    phony "name" $ do
        actions
Where name is the name used on the command line to invoke the actions, and actions are the list of things to do in response. These names are not dependency-tracked and are simply run afresh each time they are requested.

The actions can be any standard build actions, although for a `clean` rule, `removeFilesAfter` is typical. This function waits until after any files have finished building (which will be none, if you do `runhaskell Build.hs clean`) then deletes all files matching `//*` in the `_build` directory. The `putNormal` function writes out a message to the console, as long as `--quiet` was not passed.

## Running

This section covers how to run the build system you have written.

#### Compiling the build system

As shown before, we can use `runhaskell Build.hs` to execute our build system, but doing so causes the build script to be compiled afresh each time. A more common approach is to add a shell script that compiles the build system and runs it. In the example directory you will find `build.sh` (Linux) and `build.bat` (Windows), both of which execute the same interesting commands. Looking at `build.sh`:

    #!/bin/sh
    mkdir -p _shake
    ghc --make Build.hs -rtsopts -threaded -with-rtsopts=-I0 -outputdir=_shake -o _shake/build && _shake/build "$@"

This script creates a folder named `_shake` for the build system objects to live in, then runs `ghc --make Build.hs` to produce `_shake/build`, then executes `_shake/build` with all arguments it was given.

The `-with-rtsopts` flag instructs the Haskell compiler to disable "idle garbage collection", making more CPU available for the commands you are running, as [explained here](https://stackoverflow.com/questions/34588057/why-does-shake-recommend-disabling-idle-garbage-collection/).

Now you can run a build by simply typing `stack exec ./build.sh` on Linux, or `stack exec build.bat` on Windows. On Linux you may want to alias `build` to `stack exec ./build.sh`.
For the rest of this document we will assume `build` runs the build system.

_Warning:_ You should not use the `-threaded` flag for GHC 7.6 or below because of a [GHC bug](https://ghc.haskell.org/trac/ghc/ticket/7646). If you do turn on `-threaded`, you should include `-qg -qb` in `-with-rtsopts`.

#### Command line flags

The initial example build system supports a number of command line flags, including:

* `build` will compile all files required by `want`.
* `build _build/main.o` will compile enough to create `_build/main.o`, ignoring all `want` requirements.
* `build clean` will delete the contents of `_build`, because of our `phony` command.
* `build --help` will list out all flags supported by the build system, currently 36 flags. Most flags supported by `make` are also supported by Shake-based build systems.
* `build -j8` will compile up to 8 rules simultaneously; by default Shake uses 1 processor.

Most flags can also be set within the program by modifying the `shakeOptions` value. As an example, `build --metadata=_metadata` causes all Shake metadata files to be stored with names such as `_metadata/.shake.database`. Alternatively we can write `shakeOptions{shakeFiles="_metadata"}` instead of our existing `shakeFiles="_build"`. Values passed on the command line take preference over those given by `shakeOptions`. Multiple overrides can be given to `shakeOptions` by separating them with a comma, for example `shakeOptions{shakeFiles="_build", shakeThreads=8}`.

#### Progress prediction

One useful feature of Shake is that it can predict the remaining build time, based on how long previous builds have taken. The number is only a prediction, but it does take account of which files require rebuilding, how fast your machine is currently running, parallelism settings etc.
You can display progress messages in the titlebar of a window by either:

* Running `build --progress`
* Setting `shakeOptions{shakeProgress = progressSimple}`

The progress message will be displayed in the titlebar of the window, for example `3m12s (82%)` to indicate that the build is 82% complete and is predicted to take a further 3 minutes and 12 seconds. If you are running Windows 7 or higher and place the [`shake-progress`](https://github.com/ndmitchell/shake) utility somewhere on your `%PATH%` then the progress will also be displayed in the taskbar progress indicator:

![](shake-progress.png)

Progress prediction is likely to be relatively poor during the first build and after running `build clean`, as then Shake has no information about the predicted execution time for each rule. To rebuild from scratch without running clean (because you really want to see the progress bar!) you can use the argument `--always-make`, which assumes all rules need rerunning.

#### Lint

Shake features a built-in "lint" feature to check the build system is well formed. To run it, use `build --lint`. You are likely to catch more lint violations if you first `build clean`. Sadly, lint does _not_ catch missing dependencies. However, it does catch:

* Changing the current directory, typically with `setCurrentDirectory`. You should never change the current directory within the build system as multiple rules running at the same time share the current directory. You can still run `cmd_` calls in different directories using the `Cwd` argument.
* Outputs that change after Shake has built them. The usual cause of this error is if the rule for `foo` also writes to the file `bar`, despite `bar` having a different rule producing it.

There is a performance penalty for building with `--lint`, but it is typically small.

#### Profiling and optimisation

Shake features an advanced profiling feature.
To build with profiling run `build --report`, which will generate an interactive HTML profile named `report.html`. This report lets you examine what happened in that run, what takes most time to run, what rules depend on what etc. For a full explanation of how to profile and optimise a build system, including getting accurate timings and using Haskell profiling, see [the profiling and optimisation page](Profiling.md).

#### Tracing and debugging

To debug a build system there are a variety of techniques that can be used:

* Run with lint checking enabled (`--lint`), which may spot and describe the problem for you.
* Run in single-threaded mode (`-j1`) to make any output clearer by not interleaving commands.
* By default a Shake build system prints out a message every time it runs a command. Use verbose mode (`--verbose`) to print more information to the screen, such as which rule is being run. Additional `--verbose` flags increase the verbosity. Three verbosity flags produce output intended for someone debugging the Shake library itself, rather than a build system based on it.
* To raise a build error call `error "error message"`. Shake will abort, showing the error message.
* To output additional information use `putNormal "output message"`. This message will be printed to the console when it is reached.
* To show additional information with either `error` or `putNormal`, use `error $ show ("message", myVariable)`. This allows you to show any local variables.

## Extensions

This section details a number of build system features that are useful in some build systems, but not the initial example, and not most average build systems.

#### Advanced `cmd` usage

The `cmd_` function has a related function `cmd` that can also obtain the stdout and stderr streams, along with the exit code. As an example:

    (Exit code, Stdout out, Stderr err) <- cmd "gcc --version"

Now the variable `code` is bound to the exit code, while `out` and `err` are bound to the stdout and stderr streams.
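As a concrete use, one might probe for an optional tool without aborting the build (a sketch; `mytool` is a made-up program, and `ExitSuccess`/`ExitFailure` come from `System.Exit`):

```haskell
(Exit code, Stdout out) <- cmd "mytool --version"
case code of
    ExitSuccess   -> putNormal $ "Found mytool: " ++ out
    ExitFailure _ -> putNormal "mytool not found, skipping the optional step"
```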
If the `Exit` result is not requested then any non-zero return value will raise an error.

Both `cmd_` and `cmd` also take additional parameters to control how the command is run. As an example:

    cmd_ Shell (Cwd "temp") "pwd"

This runs the `pwd` command through the system shell, after first changing to the `temp` directory.

#### Dependencies on environment variables

You can use tracked dependencies on environment variables using the `getEnv` function. As an example:

    link <- getEnv "C_LINK_FLAGS"
    let linkFlags = fromMaybe "" link
    cmd_ "gcc -o" [output] inputs linkFlags

This example gets the `$C_LINK_FLAGS` environment variable (which is `Maybe String`, namely a `String` that might be missing), then using `fromMaybe` defines a local variable `linkFlags` that is the empty string when `$C_LINK_FLAGS` is not set. It then passes these flags to `gcc`. If the `$C_LINK_FLAGS` environment variable changes then this rule will rebuild.

#### Dependencies on extra information

Using Shake we can depend on arbitrary extra information, such as the version of `gcc`, allowing us to automatically rebuild all C files when a different compiler is placed on the path. To track the version, we can define a rule for the file `gcc.version` which changes only when `gcc --version` changes:

    "gcc.version" %> \out -> do
        alwaysRerun
        Stdout stdout <- cmd "gcc --version"
        writeFileChanged out stdout

This rule has the action `alwaysRerun` meaning it will be run in every execution that requires it, so the `gcc --version` is always checked. This rule defines no dependencies (no `need` actions), so if it lacked `alwaysRerun`, this rule would only be run when `gcc.version` was missing. The rule then runs `gcc --version`, storing the output in `stdout`. Finally, it calls `writeFileChanged`, which writes `stdout` to `out`, but only if the contents have changed. The use of `writeFileChanged` is important, otherwise `gcc.version` would change in every run.
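Wiring that version file into the initial example's object rule is then a single extra `need` (a sketch):

```haskell
"_build//*.o" %> \out -> do
    need ["gcc.version"]  -- any compiler change now rebuilds every object file
    let c = dropDirectory1 $ out -<.> "c"
    let m = out -<.> "m"
    cmd_ "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
    needMakefileDependencies m
```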
To use this rule, we `need ["gcc.version"]` in every rule that calls `gcc`. Shake also contains a feature called "oracles", which lets you do the same thing without the use of a file, and which is sometimes more convenient. Interested readers should consult the documentation for `addOracle`.

#### Resources

Resources allow us to limit the number of simultaneous operations more precisely than the number of simultaneous jobs (the `-j` flag). For example, calls to compilers are usually CPU bound but calls to linkers are usually disk bound. Running 8 linkers will often cause an 8 CPU system to grind to a halt. We can limit ourselves to 4 linkers with:

    disk <- newResource "Disk" 4
    want [show i <.> "exe" | i <- [1..100]]
    "*.exe" %> \out -> do
        withResource disk 1 $ do
            cmd_ "ld -o" [out] ...
    "*.o" %> \out -> do
        cmd_ "cl -o" [out] ...

Assuming `-j8`, this allows up to 8 compilers, but only a maximum of 4 linkers, at any one time.

#### Multiple outputs

Some tools, for example [bison](https://www.gnu.org/software/bison/), can generate multiple outputs from one execution. We can track these in Shake using the `&%>` operator to define rules:

    ["//*.bison.h","//*.bison.c"] &%> \[outh, outc] -> do
        let src = outc -<.> "y"
        cmd_ "bison -d -o" [outc] [src]

Here we define a list of patterns that are matched, and get a list of output files. If any output file is required, then all output files will be built, with proper dependencies.

#### Changing build rules

Shake build systems are set up to rebuild files when their dependencies change, but mostly assume that the build rules themselves do not change (including both the code and the shell commands contained within). To minimise the impact of build rule changes there are three approaches:

_Use configuration files:_ Most build information, such as which files a C file includes, can be computed from source files.
Where such information is not available, such as which C files should be linked together to form an executable, use configuration files to provide it. The rule for linking can then read these configuration files, which can be properly tracked. Moving any regularly changing configuration into separate files will significantly reduce the number of build system changes.

_Depend on the build source:_ One approach is to depend on the build system source in each of the rules; then if _any_ rule changes, _everything_ will rebuild. While this option is safe, it may cause a significant number of redundant rebuilds. As a restricted version of this technique, for a generated file you can include a dependency on the generator source and use `writeFileChanged`. If the generator changes it will rerun, but typically only a few generated files will change, so little is rebuilt.

_Use a version stamp:_ There is a field named `shakeVersion` in the `ShakeOptions` record. If the build system changes in a significant and incompatible way, you can change this field to force a full rebuild. If you want all rules to depend on all rules, you can put a hash of the build system source in the version field, as [described here](https://stackoverflow.com/questions/18532552/shake-how-to-reliably-automatically-force-rebuild-when-my-rules-change-becomi/18532553#18532553).

## The Haskell Zone

From now on, this manual assumes some moderate knowledge of Haskell. Most of the things in this section are either impossible to do with other build systems or can only be faked with shell scripts. None of the Haskell is particularly advanced.

#### Haskell Expressions

You can use any Haskell function at any point. For example, to only link files without numbers in them, we can `import Data.Char` and then write:

    let os = ["_build" </> c -<.> "o" | c <- inputs, not $ any isDigit c]

For defining non-overlapping rules it is sometimes useful to use a more advanced predicate.
For example, to define a rule that only builds results which have a numeric extension, we can use the `?>` rule definition function:

    (\x -> all isDigit $ drop 1 $ takeExtension x) ?> \out -> do
        ...

We first get the extension with `takeExtension`, then use `drop 1` to remove the leading `.` that `takeExtension` includes, then test that all the characters are numeric. The standard `%>` operator is actually defined as:

    pattern %> actions = (pattern ?==) ?> actions

where `?==` is a function for matching file patterns.

#### Haskell Actions

You can run any Haskell `IO` action by using `liftIO`. As an example:

    liftIO $ launchMissiles True

Most common IO operations to run as actions are already wrapped and available in the Shake library, including `readFile'`, `writeFile'` and `copyFile'`. Other useful functions can be found in `System.Directory`.

shake-0.16.4/docs/manual/main.c

    #include <stdio.h>
    #include "constants.h"

    int main() {
        printf("%s\n", message());
        return 0;
    }

shake-0.16.4/docs/manual/constants.h

    char* message();

shake-0.16.4/docs/manual/constants.c

    char msg[] = "Hello Shake Users!";

    char* message() {
        return msg;
    }

shake-0.16.4/docs/manual/build.sh

    #!/bin/sh
    mkdir -p _shake
    ghc --make Build.hs -rtsopts -threaded -with-rtsopts=-I0 -outputdir=_shake -o _shake/build && _shake/build "$@"

shake-0.16.4/docs/manual/Build.hs

    import Development.Shake
    import Development.Shake.Command
    import Development.Shake.FilePath
    import Development.Shake.Util

    main :: IO ()
    main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
        want ["_build/run" <.> exe]

        phony "clean" $ do
            putNormal "Cleaning files in _build"
            removeFilesAfter "_build" ["//*"]

        "_build/run" <.> exe %> \out -> do
            cs <- getDirectoryFiles "" ["//*.c"]
            let os = ["_build" </> c -<.> "o" | c <- cs]
            need os
            cmd "gcc -o" [out] os

        "_build//*.o" %> \out -> do
            let c = dropDirectory1 $ out -<.> "c"
            let m = out -<.> "m"
            () <- cmd "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
            needMakefileDependencies m

shake-0.16.4/docs/manual/build.bat

    @mkdir _shake 2> nul
    @ghc --make Build.hs -rtsopts -threaded -with-rtsopts=-I0 -outputdir=_shake -o _shake/build && _shake\build %*