shake-0.15.5/0000755000000000000000000000000012560222036011061 5ustar0000000000000000shake-0.15.5/shake.cabal0000644000000000000000000001741112560222036013144 0ustar0000000000000000cabal-version: >= 1.10 build-type: Simple name: shake version: 0.15.5 license: BSD3 license-file: LICENSE category: Development, Shake author: Neil Mitchell maintainer: Neil Mitchell copyright: Neil Mitchell 2011-2015 synopsis: Build system library, like Make, but more accurate dependencies. description: Shake is a Haskell library for writing build systems - designed as a replacement for @make@. See "Development.Shake" for an introduction, including an example. Further examples are included in the Cabal tarball, under the @Examples@ directory. The homepage contains links to a user manual, an academic paper and further information: . To use Shake the user writes a Haskell program that imports "Development.Shake", defines some build rules, and calls the 'Development.Shake.shakeArgs' function. Thanks to do notation and infix operators, a simple Shake build system is not too dissimilar from a simple Makefile. However, as build systems get more complex, Shake is able to take advantage of the excellent abstraction facilities offered by Haskell and easily support much larger projects. The Shake library provides all the standard features available in other build systems, including automatic parallelism and minimal rebuilds. Shake also provides more accurate dependency tracking, including seamless support for generated files, and dependencies on system information (e.g. compiler version). 
homepage: http://shakebuild.com bug-reports: https://github.com/ndmitchell/shake/issues tested-with: GHC==7.10.1, GHC==7.8.4, GHC==7.6.3, GHC==7.4.2, GHC==7.2.2 extra-doc-files: CHANGES.txt README.md extra-source-files: src/Test/C/constants.c src/Test/C/constants.h src/Test/C/main.c src/Test/MakeTutor/Makefile src/Test/MakeTutor/hellofunc.c src/Test/MakeTutor/hellomake.c src/Test/MakeTutor/hellomake.h src/Test/Tar/list.txt src/Test/Ninja/*.ninja src/Test/Ninja/subdir/*.ninja src/Test/Ninja/*.output src/Test/Progress/*.prog src/Test/Tup/hello.c src/Test/Tup/root.cfg src/Test/Tup/newmath/root.cfg src/Test/Tup/newmath/square.c src/Test/Tup/newmath/square.h src/Paths.hs docs/Manual.md docs/Ninja.md docs/Why.md docs/shake-progress.png data-files: html/viz.js html/profile.html html/progress.html html/shake-logic.js html/shake-progress.js html/shake-ui.js html/shake-util.js docs/manual/build.bat docs/manual/Build.hs docs/manual/build.sh docs/manual/constants.c docs/manual/constants.h docs/manual/main.c source-repository head type: git location: https://github.com/ndmitchell/shake.git flag portable default: False description: Obtain FileTime using portable functions library default-language: Haskell2010 hs-source-dirs: src build-depends: base == 4.*, old-time, directory, hashable >= 1.1.2.3, binary, filepath, process >= 1.1, unordered-containers >= 0.2.1, bytestring, utf8-string >= 0.3, time, random, js-jquery, js-flot, transformers >= 0.2, extra >= 1.3, deepseq >= 1.1 if flag(portable) cpp-options: -DPORTABLE else if !os(windows) build-depends: unix >= 2.5.1 exposed-modules: Development.Shake Development.Shake.Classes Development.Shake.Command Development.Shake.Config Development.Shake.FilePath Development.Shake.Rule Development.Shake.Util other-modules: Development.Ninja.Env Development.Ninja.Lexer Development.Ninja.Parse Development.Ninja.Type Development.Shake.Args Development.Shake.ByteString Development.Shake.Core Development.Shake.Database Development.Shake.Demo 
Development.Shake.Derived Development.Shake.Errors Development.Shake.FileInfo Development.Shake.FilePattern Development.Shake.Monad Development.Shake.Pool Development.Shake.Profile Development.Shake.Progress Development.Shake.Resource Development.Shake.Rules.Directory Development.Shake.Rules.File Development.Shake.Rules.Files Development.Shake.Rules.Oracle Development.Shake.Rules.OrderOnly Development.Shake.Rules.Rerun Development.Shake.Shake Development.Shake.Special Development.Shake.Storage Development.Shake.Types Development.Shake.Value General.Bilist General.Binary General.Cleanup General.Concurrent General.Extra General.Intern General.Process General.String General.Template General.Timing Paths_shake executable shake default-language: Haskell2010 hs-source-dirs: src ghc-options: -main-is Run.main main-is: Run.hs ghc-options: -threaded -rtsopts if impl(ghc >= 7.4) ghc-options: "-with-rtsopts=-I0 -qg -qb" else -- -qg/-qb segfaults in GHC 7.2 ghc-options: -with-rtsopts=-I0 build-depends: base == 4.*, old-time, directory, hashable >= 1.1.2.3, binary, filepath, process >= 1.1, unordered-containers >= 0.2.1, bytestring, utf8-string >= 0.3, time, random, js-jquery, js-flot, transformers >= 0.2, extra >= 1.3, deepseq >= 1.1 if flag(portable) cpp-options: -DPORTABLE else if !os(windows) build-depends: unix >= 2.5.1 other-modules: Development.Make.All Development.Make.Env Development.Make.Parse Development.Make.Rules Development.Make.Type Development.Ninja.All Run test-suite shake-test default-language: Haskell2010 type: exitcode-stdio-1.0 ghc-options: -main-is Test.main main-is: Test.hs hs-source-dirs: src ghc-options: -threaded -rtsopts build-depends: base == 4.*, old-time, directory, hashable >= 1.1.2.3, binary, filepath, process >= 1.1, unordered-containers >= 0.2.1, bytestring, utf8-string >= 0.3, time, random, js-jquery, js-flot, transformers >= 0.2, deepseq >= 1.1, extra >= 1.3, QuickCheck >= 2.0 if flag(portable) cpp-options: -DPORTABLE else if !os(windows) 
build-depends: unix >= 2.5.1 other-modules: Development.Make.All Development.Make.Env Development.Make.Parse Development.Make.Rules Development.Make.Type Development.Ninja.All Development.Ninja.Parse Development.Ninja.Type Run Test.Type Test.Assume Test.Basic Test.Benchmark Test.C Test.Cache Test.Command Test.Config Test.Digest Test.Directory Test.Docs Test.Errors Test.FilePath Test.FilePattern Test.Files Test.Journal Test.Lint Test.Live Test.Makefile Test.Manual Test.Monad Test.Ninja Test.Oracle Test.OrderOnly Test.Pool Test.Progress Test.Random Test.Resources Test.Self Test.Tar Test.Throttle Test.Tup Test.Unicode Test.Util Test.Verbosity Test.Version shake-0.15.5/Setup.hs0000644000000000000000000000005612560222036012516 0ustar0000000000000000import Distribution.Simple main = defaultMain shake-0.15.5/README.md0000644000000000000000000000564412560222036012351 0ustar0000000000000000# Shake [![Hackage version](https://img.shields.io/hackage/v/shake.svg?style=flat)](https://hackage.haskell.org/package/shake) [![Build Status](https://img.shields.io/travis/ndmitchell/shake.svg?style=flat)](https://travis-ci.org/ndmitchell/shake) Shake is a tool for writing build systems - an alternative to make, Scons, Ant etc. Shake has been used commercially for over five years, running thousands of builds per day. The website for Shake users is at [shakebuild.com](http://shakebuild.com). #### Documentation * **Why use Shake?** Shake lets you write large robust build systems, which deal properly with generated source files and run quickly. If you are writing a custom build system of any moderate size (more than a few rules) you should use Shake. The advantages over other build systems are detailed in the document [Why choose Shake?](https://github.com/ndmitchell/shake/blob/master/docs/Why.md#readme). * **How do I use Shake?** Shake is a Haskell library that you use to define your rules. 
The [Shake manual](https://github.com/ndmitchell/shake/blob/master/docs/Manual.md#readme) provides a walk-through of a small but realistic example, assuming no Haskell knowledge. * [Generated documentation](http://hackage.haskell.org/packages/archive/shake/latest/doc/html/Development-Shake.html) for all functions, including lots of examples. * [Running Ninja builds](https://github.com/ndmitchell/shake/blob/master/docs/Ninja.md#readme) using Shake. * [Blog posts](http://neilmitchell.blogspot.co.uk/search/label/shake) detailing ongoing development work. * [Profile report demo](https://cdn.rawgit.com/ndmitchell/shake/35fbe03c8d3bafeae17b58af89497ff3fdd54b22/html/demo.html) explaining what the profile reports mean. * [Academic paper](http://community.haskell.org/~ndm/downloads/paper-shake_before_building-10_sep_2012.pdf) on the underlying principles behind Shake. * [Video](http://www.youtube.com/watch?v=xYCPpXVlqFM) of a talk introducing Shake. #### Other links * [Download the Haskell package](http://hackage.haskell.org/package/shake) from Hackage and install it using Cabal. * [Mailing list](https://groups.google.com/forum/?fromgroups#!forum/shake-build-system) for any questions/bugs/thoughts on Shake. If you need more information and aren't sure where to start, use the mailing list. * [Questions](http://stackoverflow.com/questions/tagged/shake-build-system) can be asked on StackOverflow with the tag `shake-build-system`. * [Bugs](https://github.com/ndmitchell/shake/issues) can be reported on the GitHub issue tracker. * [Source code](http://github.com/ndmitchell/shake) in a git repo, stored at GitHub. * Continuous integration with [Travis](https://travis-ci.org/ndmitchell/shake) and [Hydra](http://hydra.cryp.to/jobset/shake/master). Is your company using Shake? Write something public (even just a [tweet to `@ndm_haskell`](https://twitter.com/ndm_haskell)) and I'll include a link [on the website](http://shakebuild.com/#who-uses-shake). 
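The bullets above describe Shake at a high level; as a concrete illustration, here is a minimal sketch of the kind of build script the manual develops. The file names and the gcc command line are illustrative assumptions, not taken from this repository:

```haskell
import Development.Shake
import Development.Shake.FilePath

-- A tiny build system: produce _build/main.o from main.c.
-- Assumes gcc is on $PATH; all names here are illustrative.
main :: IO ()
main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
    want ["_build/main.o"]

    "_build/main.o" %> \out -> do
        let src = "main.c"
        need [src]
        cmd "gcc -c" [src] "-o" [out]
```

Running the compiled script (or via `runhaskell`) with no arguments builds `_build/main.o`, and rebuilds it only when `main.c` changes.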
shake-0.15.5/LICENSE0000644000000000000000000000317312560222036012072 0ustar0000000000000000Copyright Neil Mitchell 2011-2015. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of Neil Mitchell nor the names of other contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. Some of the JavaScript files in html/ have different copyright and licenses. Please consult the corresponding source file in js-src/ shake-0.15.5/CHANGES.txt0000644000000000000000000004605312560222036012702 0ustar0000000000000000Changelog for Shake 0.15.5 #279, make usingConfigFile do a need on the config file Fix a bug where predicted progress could sometimes be ?? 
#264, make the test suite less non-deterministic 0.15.4 Undo a locally modified file 0.15.3 #254, in staunch mode, print out all exceptions Require extra-1.3 #259, discount time waiting for a resource in profiles #252, have the docs test configure not in dist 0.15.2 #248, add another example of using cmd #245, initial support for fsatrace lint checking Reexport -<.> from filepath where available Hoogle #106, trigger on filepath version, not GHC version Add AddEnv and AddPath command options #243, close fds in child processes when spawning commands Make Ninja variable handling more accurate 0.15.1 If you have Shakefile.hs, pass it all arguments without interp Add shakeArgsPrune and shakeArgsPruneWith #228, allow running cmd async by collecting the ProcessHandle Make getShakeOptions/processorCount of 0 return the used value #212, document how to get a full terminal with cmd #225, warn if there are no want/action statements #232, don't ignore phony order-only dependencies in Ninja #226, add escaping for GraphViz labels #227, add StdinBS for passing a bytestring as Stdin Make cmd Timeout call terminateProcess as well 0.15 #203, make shakeFiles a directory rather than a file prefix #220, add getHashedShakeVersion helper #220, add shakeVersionIgnore to ignore version numbers #219, run Shakefile.hs from the shake binary #218, fix issues with incorrect unchanging with no digests #218, fix issue with ChangeModtimeAndDigest on unchanging files #216, work around GHC 7.10 RC3 bug 10176 #213, add phonys, a predicate phony rule Add CmdTime and CmdLine results to cmd/command Fix parseMakefile for words with multiple escapes in them #205, add WithStdout, like WithStderr #27, add support for capturing Stdout/Stderr with bytestrings Add FileStdout/FileStderr to write a stream direct to a file #211, add Stdouterr to capture both Stdout and Stderr streams Require extra-1.1 (to use nubOrd) Generalise cmd to work with Maybe [String] Add unit for use with cmd IMPORTANT: Incompatible 
on disk format change #209, improve orderOnly dependencies 0.14.3 Support for the filepath shipped with GHC 7.10 Add Timeout option to command 0.14.2 #198, add operator to join FilePatterns #198, fix the <.> and other extension methods to work with // 0.14.1 #196, change the links to point at http://www.shakebuild.com/ Improve the error messages when &%> or &?> go wrong Normalise file patterns used by &?> 0.14 Make FilePattern on Windows cope with all path separators Rename normalise to normaliseEx Expose native System.FilePath in Development.Shake.FilePath #193, eliminate the custom version of #193, make need/orderOnly call normalise on their arguments #177, use the $PATH variable to resolve cmd #182, use js-jquery and js-flot #195, add getConfigKeys #194, replace *> with %> as the operator for rules #188, improve the docs on FilePattern #187, removeFiles on a missing directory should do nothing #166, add withTempFile and withTempDir in the Action monad Ensure Ninja response files are always deleted Fix incorrect units when showing profile to the console Require the extra library #178, make --help show the current executable name 0.13.4 #171, fix the --demo mode on Linux 0.13.3 Ensure you wait until the progress thread cleans up Add --demo mode Add --progress=record and --progress=replay Add -j to run with one thread per processor Fix progress reporting to work even on the first run Apply abbreviations to staunch failure messages Fix failure progress messages during staunch mode #165, make |*> matching with simple files much faster Add toStandard to the FilePath module #130, support msvc_deps_prefix in Ninja #157, allow variable substitution in include/subninja statements #147, add the version in the profile reports #147, add --numeric-version flag #28, switch to continuation passing style, uses far less threads #156, add readConfigFileWithEnv for configs with an initial env #129, fix processor count detection on Mac with -threaded #155, add --live and 
shakeLiveFiles #153, handle escaped spaces in parseMakefile #152, fix typo in the manual Add copyFileChanged Fix the leaf column in profile reports Add Maybe instances for cmd arguments Add a --profile flag (alias for --report) Switch to an underlying continuation monad 0.13.2 #95, ensure progress never gets corrupted #124, add a profile report demo #128, allow long Ninja command lines Fix --report=- for builds with no commands in them 0.13.1 Remove all package upper bounds #126, Ninja compatibility if Ninja fails to create a file #123, generate Chrome compatible traces 0.13 #122, make --report=- write a report to stdout Improve the profile report summary #122, turn shakeReport into a list of files, instead of a Maybe #60, improve how command lines are printed out #113, print info about copyFile' and removeFilesAfter at -V Replace **> with |*> , ?>> with &?> and *>> with &*> IMPORTANT: Incompatible on disk format change #83, support digest rules for files #83, add shakeChange parameter and --digest* args #83, add equalValue function to Rule typeclass Deprecate defaultRule Make literal *> matches take precedence over wildcard matches #120, add a priority function 0.12 #62, move to a ReaderT/IORef for the Action monad Add DEPRECATED pragmas on system' calls Delete Development.Shake.Sys, use command or cmd instead Add a 'console' pool to Ninja Avoid using System.Cmd (deprecated in GHC HEAD) #41, use higher precision file times on POSIX #117, use higher precision times for Unicode files on Windows #118, add support for Ninja -t compdb #119, more test fixes for Linux GHC 7.8 0.11.7 #119, test fixes for Linux GHC 7.8 0.11.6 #114, run build.sh in the test suite with sh #115, fix the test suite on Linux GHC 7.8 #116, fix for GHC 7.10 0.11.5 Include the sources for flot and jQuery in the dist tarball Note that .. 
in FilePattern values is unlikely to be correct #109, make removeFiles delete directories that match #84, make removeFiles not delete empty unmatched directories #111, fixes to the user manual #110, don't give a warning if -threaded is turned off #103, don't suggest -qg and -qb, they fail without -threaded #102, fix up when the LHS starts with . 0.11.4 Work with QuickCheck-2.7 (which defines ===) #100, handle GraphViz missing gracefully Fix up the profiling report generation #99, add getEnvWithDefault 0.11.3 #97, fix a serialisation bug when > 254 arguments to need 0.11.2 #96, fix a bug in addPath that caused $PATH to be added twice 0.11.1 #94, GHC 7.8 support Add a Config module #89, support :: as a build rule separator 0.11 Add alternatives to allow overlapping rules Make storedValue take a ShakeOptions structure Generalise the newCache function Improve the performance of the Ninja parser Make the database more compact #84, ensure you normalise removeFile patterns first #82, make -j0 guess at the number of processors #81, add --lint-tracker to use tracker.exe Add trackRead, trackWrite Add trackUse, trackChange, trackAllow #85, move rule creation functions into Development.Shake.Rule Mark Development.Shake.Sys as DEPRECATED with a pragma Change shakeLint to be of type Maybe Lint, instead of Bool #50, add shakeArgsAccumulate 0.10.10 Improve Ninja --lint checking 0.10.9 #76, add Ninja specific lint checking #75, add orderOnly dependencies #76, add needed, to assert a need doesn't rebuild #78, don't use cabal in the test suite #77, add shakeRunCommands and --skip-commands #67, add withVerbosity #51, add getShakeOptions Lots of Haddock improvements Deprecate system', use cmd or command instead #53, add addPath and addEnv to modify environment variables Make all search path things return the native separators #73, if Haskell can't write a unicode filename, skip the test Print out the entire database in diagnostics mode Rewrite database writes, old code could break 
metadata on error #25, optimise building with Ninja Fix many bugs in FilePath.normalise Require QuickCheck-2.0 or higher Change how the makefile parser treats \ characters Add a --no-build flag #74, allow cmd to be used with result IO Add a Util module, with Makefile parsing 0.10.8 Allow unix-1.5.1 again Require Cabal 1.10 or above Convert to the cabal test compatible test suite 0.10.7 Support time-1.2 #71, fix a bug when requesting all three outputs from command #64, add progressProgram and use it in progressSimple Remove isRunning, kill the progress thread on completion #47, improve the performance of ?==, especially on "//*" #68, improve the docs for addOracle #55, ensure if you need phony targets you rebuild every time #52, ensure all command output is printed #20, document preferred compilation flags #20, speed up the Shake executable with GHC RTS flags #39, print out more of the version string on database change #41, require unix-2.6 or above #48, make it clear filetimes are hashes #43, improve the error messages on lint failures #45, avoid use of nub #45, avoid use of intersect which is O(n^2) #26, add newThrottle to create throttling resources #26, add unsafeExtraThread to run more than shakeThreads #32, add a check that withResources does not call need #614, support unicode filenames Require unordered-containers-0.2.1 or above Reduce stack usage when lint checking #24, create the output directories for the multiple file rule Improvements to match Ninja behaviour 0.10.6 Include command/cmd in the default module #16, change the scoping in subninja #15, fix up the Ninja default targets Fix up --version output 0.10.5 Improve progress prediction for the first build Fix removeFiles when there is a directory argument and a pattern Delete shakeDeterministic, shakeThreads=1 has the same effect 0.10.4 Fix writeFileChanged for line endings on Windows Support arguments to --progress Set the number of capabilities based on -j flags Add shakeTimings, to time the 
separate stages Add AssumeSkip, mostly for benchmarking Normalise file arguments before calling want Expose ShakeException Add isFailure to ShakeProgress, and display in progressDisplay Add withResources Add -<.> alias for replaceExtension Add selectors on Stdout/Stdin/Exit Add CmdResult instance for ExitCode 0.10.3 Upgrade to Flot-0.8 Small documentation markup fixes 0.10.2 Require process-1.1 or above Improve progress guesses (use exponential decay of work rate) Improve profiling report error messages Add a Command module, with alternatives to system' etc. #593, don't assume doesDirectoryExist = not . doesFileExist #594, swap how traced messages are printed out Add Chatty verbosity, for what was Loud Make repeated --verbose keep increasing the verbosity If shakeThreads==1 then always use a deterministic pool Add shakeLineBuffering, defaulting to True Improve the performance of profiling Documentation and corrections improvements Eliminate StepKey from the profiling output Add a --no-time flag Makefile improvements Avoid use of ghc-options in .cabal file 0.10.1 Allow the shake executable to build 0.10 Add phony, for writing phony actions Add removeFilesAfter Switch to the new profiling code Add actionOnException and actionFinally Add shakeArgsWith Rename shakeWithArgs to shakeArgs and change the clean argument Remove the -f alias for --flush Add a shake executable which runs Makefiles Rename shake to shake-test Change how progress messages are written Do not write a beep in progressDisplay, avoids a beep Add exe function to FilePath 0.9.1 GHC head (7.7) compatibility by removing the Rules fundep 0.9 Make the tests work on Linux Fix report generation on machines whose text format is not ASCII Make Directory functions work the same on Linux/Mac Change shakeVersion to be a String Stop alwaysRerun causing lint failures Make shakeLint check that the current directory does not change Add shakeOutput setting Add removeFiles function Add Sys module for writing concise 
system calls Add getEnv function Add shakeWithArgs and shakeOptDescrs Add newCache to cache the parsed contents of files Add newResourceIO and move newResource into the Rules monad Add shakeStorageLog, to help diagnosing obscure database errors 0.8 Improve the Oracle documentation Allow getDirectoryFiles to operate recursively 0.7 Change getDirectoryFiles to take a set of file patterns Add doesDirectoryExist function Add shakeAbbreviations feature Add a new Progress module for sensible progress messages Spawn shakeProgress on a separate thread, a safer default Improve the runtime error messages Add a quietly function, to hide traced commands Print main status messages when running a traced command Display the exit code when a system command fails Fix AssumeClean when the result exists but has never been built IMPORTANT: Incompatible on disk format change Change the storage to not write messages on --silent Add Applicative on Rules Add Applicative on Action 0.6 Make addOracle return a result of askOracle Export the class bodies from Classes Don't export ShakeValue from Classes 0.5 IMPORTANT: Incompatible on disk format change Add the ShakeValue constraint synonym Change the Oracle to be strongly typed Add a Classes module 0.4 Add shakeFlush to control how often flush is called Fix a serious space leak in the thread pool #502, add shakeAssume to assume files are clean/dirty Remove validStored, replace with storedValue Remove the default validStored class, almost never right Remove Eq/Ord from ShakeOptions (now contains a func) #501, add statistics gathering functions to help report progress Ensure registering witnesses is thread safe Ensure GHC does not over-optimise type registration 0.3.10 Add Paths.hs so you can run the tests from the tarball 0.3.9 Don't rely on toList returning a consistent order Allow hashable-1.2 0.3.8 Fix up FilePattern so "foo//bar" ?== "foo/bar" 0.3.7 Update the cabal file 0.3.6 Add addOracles, for implementing more advanced oracles Add 
withoutActions, for implementing command line arguments 0.3.5 #571, vastly improve the correctness of FilePattern #574, documentation typos Expose rulePriority 0.3.4 Update documentation with links to ICFP 2012 paper/talk 0.3.3 Minor refactorings 0.3.2 Fix cabal specification on non-Windows 0.3.1 Don't use syntax also stolen by QuasiQuotes Avoid warnings about unused monadic values Fix up getModificationTime for GHC 7.6 Don't assume catch is exported by Prelude (GHC 7.6 compat) Improve the error message when failing to build with *>> or ?>> Introduce a portable flag, to switch to portable FileTime ops Improve the error message when failing to build a file Ensure errors raised bubble up quickly Significant improvements to the profile output Only trace the program name by default, not the entire command Allow unordered-containers < 0.2 again Intern all keys, so they are only stored once Optimise modification time checking on Unix Optimise modification time checking on Windows Make some fields smaller on disk IMPORTANT: Incompatible on disk format change Switch to ByteString for storing File Add shakeDeterministic, along with a default random pool Make the Monad/Monoid instance for Rules strict 0.3 #550, ensure the journal thread is locked #550, if your thread dies while reading the database then die #550, ensure you reset to before the slop before writing If the witness table is completely corrupt give a better error Make storage work with exceptions whose messages raise errors Journal error conditions, to ensure they don't repeat Better messages on invalid database versions Complete rewrite of the storage layer and journal design IMPORTANT: Incompatible on disk format change 0.2.11 IMPORTANT: #546, don't save the database to where it was created 0.2.10 Add systemCwd for running with a particular current directory Ensure the database is saved to where it was created #545, make sure if shake is killed, it kills its children 0.2.9 Require unordered-containers 2.1 or 
higher, fixes bugs Allow transformers-0.3.* Add a random script generator and tester Improve the documentation for withResource Fix the unordered-containers constraint, was too loose Don't write report generation messages in Quiet mode Add ?>> operator, a generalisation of *>> 0.2.8 Rename shakeDump to shakeReport Add profile report generator 0.2.7 #530, require unordered-containers >= 0.1.4.3 0.2.6 Improve the documentation code fragments (more links) Add support for managing finite resources 0.2.5 Require hashable-1.1.2.3, which has a TypeRep instance Add Data/Typeable instances for ShakeOptions Add Data instance for Verbosity 0.2.4 Include the C example source files Significant documentation improvements 0.2.3 Create the shakeFiles directory if missing 0.2.2 Allow deepseq-1.3.* Add a basic lint checking mode Remove the Dirty state entirely (was incorrect) 0.2.1 Put diagnostics in more places Add a C example 0.2 Add shakeStaunch Rename shakeParallel to shakeThreads Delete the lint support (never actually worked) Completely rewrite the central build algorithm Add verbosity Diagnostic Improve FilePath.normalise Add writeFileChanged Add systemOutput Add an Oracle module Add an explicit Verbosity type Change to lower precedence for *> etc 0.1.5 Allow deepseq-1.2 0.1.4 Don't export currentRule/currentStack Fix **>, was matching the wrong way round Fix FilePath normalise so it works properly on Windows Properly detect recursive rules, instead of looping Add *>> for building multiple files at once 0.1.3 Fix a bug where a file rule could return with a lazy error Make sure all the files are stored in binary, not text Fix a pattern match error in getDirectoryFiles 0.1.2 Add a warning in the description 0.1.1 Significantly improved documentation system' now takes an initial argument for the program 0.1 Many changes to signatures, some basic documentation Additional demo, to build shake itself 0.0 Initial version, not ready for public use 
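Several changelog entries above concern `FilePattern` matching — the 0.3.8 fix making `"foo//bar" ?== "foo/bar"` hold, and the `?==` speedups (especially on `"//*"`) in 0.10.7. A small sketch of the operator's semantics, using the `(?==)` export from `Development.Shake`; the match results in the comments reflect my reading of the documented semantics, except the first, which is stated in the changelog itself:

```haskell
import Development.Shake (FilePattern, (?==))

-- (?==) tests whether a FilePattern matches a FilePath:
-- '*' matches within a single path component, while '//' matches
-- any number of components, including zero (the 0.3.8 fix above).
main :: IO ()
main = mapM_ print
    [ "foo//bar" ?== "foo/bar"   -- True: '//' can match zero components
    , "//*.c"    ?== "a/b/c.c"   -- True: '//' spans directories
    , "*.c"      ?== "a/b.c"     -- False: '*' does not cross '/'
    ]
```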
shake-0.15.5/src/0000755000000000000000000000000012560222036011650 5ustar0000000000000000shake-0.15.5/src/Test.hs0000644000000000000000000001133012560222036013121 0ustar0000000000000000 module Test(main) where import Control.Exception import Control.Monad import Data.Maybe import System.Environment.Extra import General.Timing import Development.Shake.FileInfo import General.String import qualified Data.ByteString.Char8 as BS import Test.Type(sleepFileTimeCalibrate) import Control.Concurrent import qualified Test.Assume as Assume import qualified Test.Basic as Basic import qualified Test.Benchmark as Benchmark import qualified Test.C as C import qualified Test.Cache as Cache import qualified Test.Command as Command import qualified Test.Config as Config import qualified Test.Digest as Digest import qualified Test.Directory as Directory import qualified Test.Docs as Docs import qualified Test.Errors as Errors import qualified Test.Files as Files import qualified Test.FilePath as FilePath import qualified Test.FilePattern as FilePattern import qualified Test.Journal as Journal import qualified Test.Lint as Lint import qualified Test.Live as Live import qualified Test.Makefile as Makefile import qualified Test.Manual as Manual import qualified Test.Monad as Monad import qualified Test.Ninja as Ninja import qualified Test.Oracle as Oracle import qualified Test.OrderOnly as OrderOnly import qualified Test.Pool as Pool import qualified Test.Progress as Progress import qualified Test.Random as Random import qualified Test.Resources as Resources import qualified Test.Self as Self import qualified Test.Tar as Tar import qualified Test.Throttle as Throttle import qualified Test.Tup as Tup import qualified Test.Unicode as Unicode import qualified Test.Util as Util import qualified Test.Verbosity as Verbosity import qualified Test.Version as Version import qualified Run fakes = ["clean" * clean, "test" * test, "make" * makefile, "filetime" * filetime] where (*) = (,) mains = 
["tar" * Tar.main, "self" * Self.main, "c" * C.main ,"basic" * Basic.main, "cache" * Cache.main, "command" * Command.main ,"config" * Config.main, "digest" * Digest.main, "directory" * Directory.main ,"docs" * Docs.main, "errors" * Errors.main, "orderonly" * OrderOnly.main ,"filepath" * FilePath.main, "filepattern" * FilePattern.main, "files" * Files.main ,"journal" * Journal.main, "lint" * Lint.main, "live" * Live.main, "makefile" * Makefile.main, "manual" * Manual.main ,"monad" * Monad.main, "pool" * Pool.main, "random" * Random.main, "ninja" * Ninja.main ,"resources" * Resources.main, "assume" * Assume.main, "benchmark" * Benchmark.main ,"oracle" * Oracle.main, "progress" * Progress.main, "unicode" * Unicode.main, "util" * Util.main ,"throttle" * Throttle.main, "verbosity" * Verbosity.main, "version" * Version.main, "tup" * Tup.main] where (*) = (,) main :: IO () main = do resetTimings xs <- getArgs exePath <- getExecutablePath case flip lookup (fakes ++ mains) =<< listToMaybe xs of _ | null xs -> do putStrLn "******************************************************************" putStrLn "** Running shake test suite, run with '--help' to see arguments **" putStrLn "******************************************************************" withArgs ["test"] main withArgs ["random","test","3m"] main Nothing -> putStrLn $ unlines ["Welcome to the Shake demo" ,"" ,unwords $ "Modes:" : map fst fakes ,unwords $ "Demos:" : map fst mains ,"" ,"As an example, try:" ,"" ,unwords [" ", exePath, "self", "--jobs=2", "--trace"] ,"" ,"Which will build Shake, using Shake, on 2 threads."] Just main -> main =<< sleepFileTimeCalibrate makefile :: IO () -> IO () makefile _ = do args <- getArgs withArgs (drop 1 args) Run.main filetime :: IO () -> IO () filetime _ = do args <- getArgs addTiming "Reading files" files <- fmap concat $ forM (drop 1 args) $ \file -> fmap (BS.lines . 
BS.filter (/= '\r')) $ BS.readFile file let n = length files evaluate n addTiming "Modtime" let (a,bcd) = splitAt (n `div` 4) files let (b,cd) = splitAt (n `div` 4) bcd let (c,d) = splitAt (n `div` 4) cd vars <- forM [a,b,c,d] $ \xs -> do mvar <- newEmptyMVar forkIO $ do mapM_ (getFileInfo . packU_) xs putMVar mvar () return $ takeMVar mvar sequence_ vars printTimings clean :: IO () -> IO () clean extra = sequence_ [withArgs [name,"clean"] $ main extra | (name,main) <- mains] test :: IO () -> IO () test yield = do args <- getArgs flip onException (putStrLn "TESTS FAILED") $ sequence_ [withArgs (name:"test":drop 1 args) $ test yield | (name,test) <- mains, name /= "random"] shake-0.15.5/src/Run.hs0000644000000000000000000000416312560222036012754 0ustar0000000000000000 module Run(main) where import Development.Make.All import Development.Ninja.All import System.Environment import Development.Shake import Development.Shake.FilePath import General.Timing import Control.Monad.Extra import Control.Exception.Extra import Data.Maybe import qualified System.Directory as IO import System.Console.GetOpt import System.Process import System.Exit main :: IO () main = do resetTimings args <- getArgs hsExe <- findFile [".shake" "shake" <.> exe ,"Shakefile.hs","Shakefile.lhs"] case hsExe of Just file -> do (prog,args) <- return $ if takeExtension file `elem` [".hs",".lhs"] then ("runhaskell", file:args) else (toNative file, args) e <- rawSystem prog args when (e /= ExitSuccess) $ exitWith e Nothing -> do withArgs ("--no-time":args) $ shakeArgsWith shakeOptions{shakeCreationCheck=False} flags $ \opts targets -> do let tool = listToMaybe [x | Tool x <- opts] makefile <- case reverse [x | UseMakefile x <- opts] of x:_ -> return x _ -> do res <- findFile ["makefile","Makefile","build.ninja"] case res of Just x -> return x Nothing -> errorIO "Could not find `makefile', `Makefile' or `build.ninja'" case () of _ | takeExtension makefile == ".ninja" -> runNinja makefile targets tool _ | 
isJust tool -> error "--tool flag is not supported without a .ninja Makefile" _ -> fmap Just $ runMakefile makefile targets data Flag = UseMakefile FilePath | Tool String flags = [Option "f" ["file","makefile"] (ReqArg (Right . UseMakefile) "FILE") "Read FILE as a makefile." ,Option "t" ["tool"] (ReqArg (Right . Tool) "TOOL") "Ninja-compatible tools." ] findFile :: [FilePath] -> IO (Maybe FilePath) findFile = findM (fmap (either (const False) id) . try_ . IO.doesFileExist) shake-0.15.5/src/Paths.hs0000644000000000000000000000136412560222036013267 0ustar0000000000000000-- | Fake cabal module for local building module Paths_shake where import Data.Version import System.IO.Unsafe import System.Directory import Control.Exception import Text.ParserCombinators.ReadP -- We want getDataFileName to be relative to the current directory even if -- we issue a change directory command. Therefore, first call caches, future ones read. curdir :: String curdir = unsafePerformIO getCurrentDirectory getDataFileName :: FilePath -> IO FilePath getDataFileName x = do evaluate curdir return $ curdir ++ "/" ++ x version :: Version -- can't write a literal Version value since in GHC 7.10 the versionsTag field is deprecated version = head $ [v | (v,"") <- readP_to_S parseVersion "0.0"] ++ error "version, failed to parse" shake-0.15.5/src/Test/0000755000000000000000000000000012560222036012567 5ustar0000000000000000shake-0.15.5/src/Test/Version.hs0000644000000000000000000000175712560222036014562 0ustar0000000000000000 module Test.Version(main) where import Development.Shake import Test.Type main = shaken test $ \args obj -> do want [obj "foo.txt"] obj "foo.txt" %> \file -> liftIO $ appendFile file "x" test build obj = do writeFile (obj "foo.txt") "" v1 <- getHashedShakeVersion [obj "foo.txt"] writeFile (obj "foo.txt") "y" v2 <- getHashedShakeVersion [obj "foo.txt"] assert (v1 /= v2) "Hashes must not be equal" build ["clean"] build [] assertContents (obj "foo.txt") "x" build 
["--rule-version=new","--silent"] assertContents (obj "foo.txt") "xx" build ["--rule-version=new"] assertContents (obj "foo.txt") "xx" build ["--rule-version=extra","--silent"] assertContents (obj "foo.txt") "xxx" build ["--rule-version=more","--no-rule-version"] assertContents (obj "foo.txt") "xxx" build ["--rule-version=more"] assertContents (obj "foo.txt") "xxx" build ["--rule-version=final","--silent"] assertContents (obj "foo.txt") "xxxx" shake-0.15.5/src/Test/Verbosity.hs0000644000000000000000000000235412560222036015115 0ustar0000000000000000 module Test.Verbosity(main) where import Development.Shake import Test.Type main = shaken test $ \args obj -> do want $ map obj args obj "in.txt" %> \out -> do a <- getVerbosity b <- withVerbosity Normal $ getVerbosity writeFile' out $ unwords $ map show [a,b] obj "out.txt" %> \out -> do x <- getVerbosity ys <- withVerbosity Loud $ do a <- getVerbosity need [obj "in.txt"] -- make sure the inherited verbosity does not get passed along b <- getVerbosity c <- quietly getVerbosity d <- fmap shakeVerbosity getShakeOptions return [a,b,c,d] z <- getVerbosity writeFile' out $ unwords $ map show $ [x] ++ ys ++ [z] test build obj = do build ["out.txt","--clean"] assertContents (obj "in.txt") "Normal Normal" assertContents (obj "out.txt") "Normal Loud Loud Quiet Normal Normal" build ["out.txt","--clean","--verbose"] assertContents (obj "in.txt") "Loud Normal" assertContents (obj "out.txt") "Loud Loud Loud Quiet Loud Loud" build ["out.txt","--clean","--quiet"] assertContents (obj "in.txt") "Quiet Normal" assertContents (obj "out.txt") "Quiet Loud Loud Quiet Quiet Quiet" shake-0.15.5/src/Test/Util.hs0000644000000000000000000000145712560222036014047 0ustar0000000000000000 module Test.Util(main) where import Development.Shake.Util import Test.Type main = shaken test $ \args obj -> return () test build obj = do parseMakefile "" === [] parseMakefile "a:b c\ndef : ee" === [("a",["b","c"]),("def",["ee"])] parseMakefile "a: 
#comment\n#comment : b\nc : d" === [("a",[]),("c",["d"])] parseMakefile "a \\\n\t:b" === [("a",["b"])] parseMakefile "#comment\\ a : b" === [] parseMakefile "a: b c \\\n d e\n\nf:g" === [("a",["b","c","d","e"]),("f",["g"])] parseMakefile "foo/bar: \\\r\n c:/a1 \\\r\n x\r\n" === [("foo/bar",["c:/a1","x"])] parseMakefile "output.o: src/file/with\\ space.cpp" === [("output.o",["src/file/with space.cpp"])] parseMakefile "a: b\\ c" === [("a",["b ","c"])] parseMakefile "a: b\\ c\\ d e" === [("a",["b c d","e"])] shake-0.15.5/src/Test/Unicode.hs0000644000000000000000000000413312560222036014512 0ustar0000000000000000 module Test.Unicode(main) where import Development.Shake import Development.Shake.FilePath import Test.Type import Control.Exception.Extra import Control.Monad import System.Directory(createDirectoryIfMissing) -- | Decode a dull ASCII string to certain unicode points, necessary because -- withArgs (even the UTF8 version) throws an encoding error on the > 256 code points decode :: String -> String decode ('e':'^':xs) = '\xEA' : decode xs -- Latin Small Letter E With Circumflex decode (':':')':xs) = '\x263A' : decode xs -- White Smiling Face decode (x:xs) = x : decode xs decode [] = [] main = shaken test $ \xs obj -> do let pre:args = map decode xs want $ map obj args obj (pre ++ "dir/*") %> \out -> do let src = takeDirectory (takeDirectory out) takeFileName out copyFile' src out obj (pre ++ ".out") %> \out -> do a <- readFile' $ obj $ pre ++ "dir" pre <.> "source" b <- readFile' $ obj pre <.> "multi1" writeFile' out $ a ++ b map obj ["*.multi1","*.multi2"] &%> \[m1,m2] -> do b <- doesFileExist $ m1 -<.> "exist" writeFile' m1 $ show b writeFile' m2 $ show b test build obj = do build ["clean"] -- Useful, if the error message starts crashing... 
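The commented-out `hSetEncoding` lines here hint at a debugging aid for when a test's error message itself crashes with an encoding exception. A hedged standalone sketch of that aid, using only `System.IO` functions that exist in base (`forceByteOutput` is an illustrative name, not from this source):

```haskell
-- Force both standard handles to a byte-oriented encoding so that error
-- messages containing code points above Latin-1 cannot themselves raise
-- an encoding exception while being printed.
import qualified System.IO as IO

forceByteOutput :: IO ()
forceByteOutput = do
    IO.hSetEncoding IO.stdout IO.char8
    IO.hSetEncoding IO.stderr IO.char8
```

This trades correct display of Unicode output for the guarantee that diagnostics always reach the terminal.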
-- IO.hSetEncoding IO.stdout IO.char8 -- IO.hSetEncoding IO.stderr IO.char8 forM_ ["normal","e^",":)","e^-:)"] $ \pre -> do createDirectoryIfMissing True $ obj "" let ext x = obj $ decode pre <.> x res <- try_ $ writeFile (ext "source") "x" case res of Left err -> putStrLn $ "WARNING: Failed to write file " ++ pre ++ ", skipping unicode test (LANG=C ?)" Right _ -> do build [pre,pre <.> "out","--sleep"] assertContents (ext "out") $ "x" ++ "False" writeFile (ext "source") "y" build [pre,pre <.> "out","--sleep"] assertContents (ext "out") $ "y" ++ "False" writeFile (ext "exist") "" build [pre,pre <.> "out"] assertContents (ext "out") $ "y" ++ "True" shake-0.15.5/src/Test/Type.hs0000644000000000000000000002073512560222036014053 0ustar0000000000000000 module Test.Type(sleep, module Test.Type) where import Development.Shake hiding (copyFileChanged) import Development.Shake.Rule() -- ensure the module gets imported, and thus tested import General.String import Development.Shake.FileInfo import Development.Shake.FilePath import Control.Applicative import Control.Exception.Extra hiding (assert) import Control.Monad.Extra import Data.List import Data.Maybe import qualified Data.ByteString as BS import System.Directory as IO import System.Environment.Extra import System.Random import System.Console.GetOpt import System.IO.Extra as IO import System.Time.Extra import System.Info.Extra import Prelude shaken :: (([String] -> IO ()) -> (String -> String) -> IO ()) -> ([String] -> (String -> String) -> Rules ()) -> IO () -> IO () shaken test rules sleeper = do name:args <- getArgs when ("--sleep" `elem` args) sleeper putStrLn $ "## BUILD " ++ unwords (name:args) args <- return $ delete "--sleep" args let out = "output/" ++ name ++ "/" let obj x = if "/" `isPrefixOf` x then init out ++ x else out ++ x createDirectoryIfMissing True out case args of "test":extra -> do putStrLn $ "## TESTING " ++ name -- if the extra arguments are not --quiet/--loud it's probably going to go wrong test 
(\args -> withArgs (name:args ++ extra) $ shaken test rules sleeper) obj putStrLn $ "## FINISHED TESTING " ++ name "clean":_ -> removeDirectoryRecursive out {- "lint":args -> do let dbfile = out ++ ".database" tempfile = "output/" ++ name ++ ".database" b <- IO.doesFileExist dbfile when b $ renameFile dbfile tempfile removeDirectoryRecursive out createDirectoryIfMissing True out when b $ renameFile tempfile dbfile shake shakeOptions{shakeFiles=out, shakeLint=True} $ rules args (out++) -} "perturb":args -> forever $ do del <- removeFilesRandom out threads <- randomRIO (1,4) putStrLn $ "## TESTING PERTURBATION (" ++ show del ++ " files, " ++ show threads ++ " threads)" shake shakeOptions{shakeFiles=out, shakeThreads=threads, shakeVerbosity=Quiet} $ rules args (out++) args -> do let (_,files,_) = getOpt Permute [] args tracker <- hasTracker withArgs (args \\ files) $ shakeWithClean (removeDirectoryRecursive out) (shakeOptions{shakeFiles = out ,shakeReport = ["output/" ++ name ++ "/report.html"] ,shakeLint = Just $ if tracker then LintTracker else LintBasic }) -- if you have passed sleep, supress the "no errors" warning (do rules files obj; when ("--sleep" `elem` args) $ action $ return ()) shaken2 :: (([String] -> IO ()) -> (String -> String) -> IO ()) -> ([String] -> (String -> String) -> Rules ()) -> IO () -> IO () shaken2 test rules sleeper = shaken test rules2 sleeper where rules2 args obj = do (objd,args) <- return $ partition ("$" `isPrefixOf`) args (spec,phon) <- return $ partition ("!" `isPrefixOf`) args want $ phon ++ map (obj . tail) objd rules (map tail spec) obj hasTracker :: IO Bool hasTracker = isJust <$> if isWindows then findExecutable "tracker.exe" else lookupEnv "FSAT" shakeWithClean :: IO () -> ShakeOptions -> Rules () -> IO () shakeWithClean clean opts rules = shakeArgsWith opts [cleanOpt] f where cleanOpt = Option "c" ["clean"] (NoArg $ Right ()) "Clean before building." 
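The `shakeWithClean` wrapper here threads one custom GetOpt flag through `shakeArgsWith`, running a cleanup action when the flag is present. A minimal standalone sketch of the same pattern, under the assumption that the rule and file names (`out.txt`) are purely illustrative:

```haskell
-- One extra --clean flag; if given, skip building entirely, otherwise
-- build the requested targets (or a default target when none are given).
import Development.Shake
import System.Console.GetOpt

main :: IO ()
main = shakeArgsWith shakeOptions [cleanOpt] $ \extras targets ->
    if not (null extras)
        then return Nothing                 -- --clean given: no rules to run
        else return $ Just $ do
            want $ if null targets then ["out.txt"] else targets
            "out.txt" %> \out -> writeFile' out "hello"
  where
    cleanOpt = Option "c" ["clean"] (NoArg $ Right ()) "Clean before building."
```

Returning `Nothing` from the continuation tells `shakeArgsWith` there is nothing to build, which is how the wrapper above suppresses the build after a clean.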
f extra files = do when (extra /= []) clean if "clean" `elem` files then clean >> return Nothing else return $ Just $ if null files then rules else want files >> withoutActions rules unobj :: FilePath -> FilePath unobj = dropDirectory1 . dropDirectory1 assert :: Bool -> String -> IO () assert b msg = unless b $ error $ "ASSERTION FAILED: " ++ msg infix 4 === (===) :: (Show a, Eq a) => a -> a -> IO () a === b = assert (a == b) $ "failed in ===\nLHS: " ++ show a ++ "\nRHS: " ++ show b assertExists :: FilePath -> IO () assertExists file = do b <- IO.doesFileExist file assert b $ "File was expected to exist, but is missing: " ++ file assertMissing :: FilePath -> IO () assertMissing file = do b <- IO.doesFileExist file assert (not b) $ "File was expected to be missing, but exists: " ++ file assertContents :: FilePath -> String -> IO () assertContents file want = do got <- IO.readFile' file assert (want == got) $ "File contents are wrong: " ++ file ++ "\nWANT: " ++ want ++ "\nGOT: " ++ got assertContentsOn :: (String -> String) -> FilePath -> String -> IO () assertContentsOn f file want = do got <- IO.readFile' file assert (f want == f got) $ "File contents are wrong: " ++ file ++ "\nWANT: " ++ want ++ "\nGOT: " ++ got ++ "\nWANT (transformed): " ++ f want ++ "\nGOT (transformed): " ++ f got assertContentsWords :: FilePath -> String -> IO () assertContentsWords = assertContentsOn (unwords . words) assertContentsInfix :: FilePath -> String -> IO () assertContentsInfix file want = do got <- IO.readFile' file assert (want `isInfixOf` got) $ "File contents are wrong: " ++ file ++ "\nWANT (anywhere): " ++ want ++ "\nGOT: " ++ got assertContentsUnordered :: FilePath -> [String] -> IO () assertContentsUnordered file xs = assertContentsOn (unlines . sort . 
lines) file (unlines xs) assertException :: [String] -> IO () -> IO () assertException parts act = do res <- try_ act case res of Left err -> let s = show err in forM_ parts $ \p -> assert (p `isInfixOf` s) $ "Incorrect exception, missing part:\nGOT: " ++ s ++ "\nWANTED: " ++ p Right _ -> error "Expected an exception but succeeded" noTest :: ([String] -> IO ()) -> (String -> String) -> IO () noTest build obj = do build ["--abbrev=output=$OUT","-j3"] build ["--no-build","--report=-"] build [] -- | Sleep long enough for the modification time resolution to catch up sleepFileTime :: IO () sleepFileTime = sleep 1 sleepFileTimeCalibrate :: IO (IO ()) sleepFileTimeCalibrate = do let file = "output/calibrate" createDirectoryIfMissing True $ takeDirectory file mtimes <- forM [1..10] $ \i -> fmap fst $ duration $ do writeFile file $ show i let time = fmap (fst . fromMaybe (error "File missing during sleepFileTimeCalibrate")) $ getFileInfo $ packU file t1 <- time flip loopM 0 $ \j -> do writeFile file $ show (i,j) t2 <- time return $ if t1 == t2 then Left $ j+1 else Right () putStrLn $ "Longest file modification time lag was " ++ show (ceiling (maximum mtimes * 1000)) ++ "ms" return $ sleep $ min 1 $ maximum mtimes * 2 removeFilesRandom :: FilePath -> IO Int removeFilesRandom x = do files <- getDirectoryContentsRecursive x n <- randomRIO (0,length files) rs <- replicateM (length files) (randomIO :: IO Double) mapM_ (removeFile . snd) $ sort $ zip rs files return n getDirectoryContentsRecursive :: FilePath -> IO [FilePath] getDirectoryContentsRecursive dir = do xs <- IO.getDirectoryContents dir (dirs,files) <- partitionM IO.doesDirectoryExist [dir x | x <- xs, not $ "." 
`isPrefixOf` x] rest <- concatMapM getDirectoryContentsRecursive dirs return $ files++rest copyDirectoryChanged :: FilePath -> FilePath -> IO () copyDirectoryChanged old new = do xs <- getDirectoryContentsRecursive old forM_ xs $ \from -> do let to = new drop (length $ addTrailingPathSeparator old) from createDirectoryIfMissing True $ takeDirectory to copyFileChanged from to copyFileChanged :: FilePath -> FilePath -> IO () copyFileChanged old new = do good <- IO.doesFileExist new good <- if not good then return False else liftM2 (==) (BS.readFile old) (BS.readFile new) when (not good) $ copyFile old new withTemporaryDirectory :: (FilePath -> IO ()) -> IO () withTemporaryDirectory act = do tdir <- getTemporaryDirectory bracket (openTempFile tdir "shake.hs") (removeFile . fst) $ \(file,h) -> do hClose h let dir = file ++ "_" bracket_ (createDirectory dir) (removeDirectoryRecursive dir) (act dir) skip :: Monad m => a -> m () skip x = return () shake-0.15.5/src/Test/Tup.hs0000644000000000000000000000257612560222036013705 0ustar0000000000000000 module Test.Tup(main) where import Development.Shake import Development.Shake.Config import Development.Shake.FilePath import Development.Shake.Util import Test.Type import Data.Maybe main = shaken noTest $ \args obj -> do -- Example inspired by http://gittup.org/tup/ex_multiple_directories.html usingConfigFile "src/Test/Tup/root.cfg" action $ do keys <- getConfigKeys need [x -<.> exe | x <- keys, takeExtension x == ".exe"] let objects dir key = do let f x | takeExtension x == ".c" = obj $ dir x -<.> "o" | takeExtension x == ".a" = obj $ takeBaseName x "lib" ++ x | otherwise = error $ "Unknown extension, " ++ x x <- fmap (fromMaybe $ error $ "Missing config key, " ++ key) $ getConfig key return $ map f $ words x (\x -> x -<.> exe == x) ?> \out -> do os <- objects "" $ takeBaseName out <.> "exe" need os cmd "gcc" os "-o" [out] obj "//lib*.a" %> \out -> do os <- objects (drop 3 $ takeBaseName out) $ drop 3 $ takeFileName out need 
os cmd "ar crs" [out] os obj "//*.o" %> \out -> do let src = "src/Test/Tup" unobj out -<.> "c" need [src] () <- cmd "gcc -c -MMD -MF" [out -<.> "d"] [src] "-o" [out] "-O2 -Wall -Isrc/Test/Tup/newmath" neededMakefileDependencies $ out -<.> "d" shake-0.15.5/src/Test/Throttle.hs0000644000000000000000000000226712560222036014737 0ustar0000000000000000 module Test.Throttle(main) where import Development.Shake import Development.Shake.FilePath import Test.Type import Control.Exception.Extra hiding (assert) import Control.Monad.Extra import System.Time.Extra main = shaken test $ \args obj -> do res <- newThrottle "test" 2 0.4 want $ map obj ["file1.1","file2.1","file3.2","file4.1","file5.2"] obj "*.*" %> \out -> do withResource res (read $ drop 1 $ takeExtension out) $ when (takeBaseName out == "file3") $ liftIO $ sleep 0.2 writeFile' out "" test build obj = do forM_ [[],["-j8"]] $ \flags -> do -- we are sometimes over the window if the machine is "a bit loaded" at some particular time -- therefore we rerun the test three times, and only fail if it fails on all of them retry 3 $ do build ["clean"] (s, _) <- duration $ build ["--no-report"] -- the 0.1s cap is a guess at an upper bound for how long everything else should take -- and should be raised on slower machines assert (s >= 1.4 && s < 1.6) $ "Bad throttling, expected to take 1.4s + computation time (cap of 0.2s), took " ++ show s ++ "s" shake-0.15.5/src/Test/Tar.hs0000644000000000000000000000044512560222036013654 0ustar0000000000000000 module Test.Tar(main) where import Development.Shake import Test.Type main = shaken noTest $ \args obj -> do want [obj "result.tar"] obj "result.tar" %> \out -> do contents <- readFileLines "src/Test/Tar/list.txt" need contents cmd "tar -cf" [out] contents shake-0.15.5/src/Test/Self.hs0000644000000000000000000000621012560222036014013 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable #-} module Test.Self(main) where import Development.Shake import 
Development.Shake.Classes import Development.Shake.FilePath import Test.Type import Control.Monad import Data.Char import Data.List.Extra import System.Info newtype GhcPkg = GhcPkg () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) newtype GhcFlags = GhcFlags () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) main = shaken noTest $ \args obj -> do let moduleToFile ext xs = map (\x -> if x == '.' then '/' else x) xs <.> ext want $ if null args then [obj "Main" <.> exe] else args -- fixup to cope with Cabal's generated files let fixPaths x = if x == "Paths_shake.hs" then "Paths.hs" else x ghcPkg <- addOracle $ \GhcPkg{} -> do Stdout out <- quietly $ cmd "ghc-pkg list --simple-output" return $ words out ghcFlags <- addOracle $ \GhcFlags{} -> do pkgs <- readFileLines $ obj ".pkgs" return $ map ("-package=" ++) pkgs let ghc args = do -- since ghc-pkg includes the ghc package, it changes if the version does ghcPkg $ GhcPkg () flags <- ghcFlags $ GhcFlags () cmd "ghc" flags args obj "Main" <.> exe %> \out -> do src <- readFileLines $ obj "Run.deps" let os = map (obj . moduleToFile "o") $ "Run" : src need os ghc $ ["-o",out] ++ os obj "//*.deps" %> \out -> do dep <- readFileLines $ out -<.> "dep" let xs = map (obj . moduleToFile "deps") dep need xs ds <- fmap (nubOrd . sort . (++) dep . concat) $ mapM readFileLines xs writeFileLines out ds obj "//*.dep" %> \out -> do src <- readFile' $ "src" fixPaths (unobj $ out -<.> "hs") let xs = hsImports src xs <- filterM (doesFileExist . ("src" ) . fixPaths . moduleToFile "hs") xs writeFileLines out xs [obj "//*.o",obj "//*.hi"] &%> \[out,_] -> do deps <- readFileLines $ out -<.> "deps" let hs = "src" fixPaths (unobj $ out -<.> "hs") need $ hs : map (obj . 
moduleToFile "hi") deps ghc $ ["-c",hs,"-isrc","-main-is","Run.main" ,"-hide-all-packages","-odir=output/self","-hidir=output/self","-i=output/self"] ++ ["-DPORTABLE","-fwarn-unused-imports"] -- to test one CPP branch obj ".pkgs" %> \out -> do src <- readFile' "shake.cabal" writeFileLines out $ sort $ cabalBuildDepends src --------------------------------------------------------------------- -- GRAB INFORMATION FROM FILES hsImports :: String -> [String] hsImports xs = [ takeWhile (\x -> isAlphaNum x || x `elem` "._") $ dropWhile (not . isUpper) x | x <- concatMap (wordsBy (== ';')) $ lines xs, "import " `isPrefixOf` trim x] -- FIXME: Should actually parse the list from the contents of the .cabal file cabalBuildDepends :: String -> [String] cabalBuildDepends _ = packages ++ ["unix" | os /= "mingw32"] packages = words $ "base transformers binary unordered-containers hashable time old-time bytestring " ++ "filepath directory process deepseq random utf8-string extra js-jquery js-flot" shake-0.15.5/src/Test/Resources.hs0000644000000000000000000000166412560222036015104 0ustar0000000000000000 module Test.Resources(main) where import Development.Shake import Test.Type import Control.Monad import Data.IORef main extra = do let cap = 2 ref <- newIORef 0 flip (shaken test) extra $ \args obj -> do want $ map obj ["file1.txt","file2.txt","file3.txt","file4.txt"] res <- newResource "test" cap res2 <- newResource "test" cap unless (res < res2 || res2 < res) $ error "Resources should have a good ordering" obj "*.txt" %> \out -> withResource res 1 $ do old <- liftIO $ atomicModifyIORef ref $ \i -> (i+1,i) when (old >= cap) $ error "Too many resources in use at one time" liftIO $ sleep 0.1 liftIO $ atomicModifyIORef ref $ \i -> (i-1,i) writeFile' out "" test build obj = do build ["clean"] build ["-j2"] build ["clean"] build ["-j4"] build ["clean"] build ["-j10"] shake-0.15.5/src/Test/Random.hs0000644000000000000000000001152012560222036014342 0ustar0000000000000000{-# LANGUAGE 
PatternGuards #-} module Test.Random(main) where import Development.Shake import Numeric.Extra import Test.Type import Control.Exception.Extra import Control.Monad import Data.List import Data.Maybe import System.Environment import System.Exit import System.Random import General.Extra import qualified System.IO.Extra as IO import System.Time.Extra inputRange = [1..10] data Value = Single Int | Multiple [[Value]] deriving (Read,Show,Eq) data Source = Input Int | Output Int | Bang deriving (Read,Show) data Logic = Logic Int [[Source]] | Want [Int] deriving (Read,Show) main = shaken test $ \args obj -> do let toFile (Input i) = obj $ "input-" ++ show i ++ ".txt" toFile (Output i) = obj $ "output-" ++ show i ++ ".txt" toFile Bang = error "BANG" let randomSleep = liftIO $ do i <- randomRIO (0, 25) sleep $ intToDouble i / 100 forM_ (map read $ filter (isNothing . asDuration) args) $ \x -> case x of Want xs -> want $ map (toFile . Output) xs Logic out srcs -> toFile (Output out) %> \out -> do res <- fmap (show . Multiple) $ forM srcs $ \src -> do randomSleep need $ map toFile src mapM (liftIO . fmap read . IO.readFile' . toFile) src randomSleep writeFileChanged out res asDuration :: String -> Maybe Double asDuration x | "s" `isSuffixOf` x, [(i,"")] <- reads $ init x = Just i | "m" `isSuffixOf` x, [(i,"")] <- reads $ init x = Just $ i * 60 | otherwise = Nothing test build obj = do limit <- do args <- getArgs let bound = listToMaybe $ reverse $ mapMaybe asDuration args time <- offsetTime return $ when (isJust bound) $ do now <- time when (now > fromJust bound) exitSuccess forM_ [1..] 
$ \count -> do limit putStrLn $ "* PERFORMING RANDOM TEST " ++ show count build ["clean"] build [] -- to create the directory forM_ inputRange $ \i -> writeFile (obj $ "input-" ++ show i ++ ".txt") $ show $ Single i logic <- randomLogic runLogic [] logic chng <- filterM (const randomIO) inputRange forM_ chng $ \i -> writeFile (obj $ "input-" ++ show i ++ ".txt") $ show $ Single $ negate i runLogic chng logic forM_ inputRange $ \i -> writeFile (obj $ "input-" ++ show i ++ ".txt") $ show $ Single i logicBang <- addBang =<< addBang logic j <- randomRIO (1::Int,8) res <- try_ $ build $ "--exception" : ("-j" ++ show j) : map show (logicBang ++ [Want [i | Logic i _ <- logicBang]]) case res of Left err | "BANG" `isInfixOf` show err -> return () -- error I expected | otherwise -> error $ "UNEXPECTED ERROR: " ++ show err _ -> return () -- occasionally we only put BANG in places with no dependenies that don't get rebuilt runLogic [] $ logic ++ [Want [i | Logic i _ <- logic]] where runLogic :: [Int] -> [Logic] -> IO () runLogic negated xs = do let poss = [i | Logic i _ <- xs] i <- randomRIO (0, 7) wants <- replicateM i $ do i <- randomRIO (0, 5) replicateM i $ randomElem poss sleepFileTime j <- randomRIO (1::Int,8) build $ ("-j" ++ show j) : map show (xs ++ map Want wants) let value i = case [ys | Logic j ys <- xs, j == i] of [ys] -> Multiple $ flip map ys $ map $ \i -> case i of Input i -> Single $ if i `elem` negated then negate i else i Output i -> value i forM_ (concat wants) $ \i -> do let wanted = value i got <- fmap read $ IO.readFile' $ obj $ "output-" ++ show i ++ ".txt" when (wanted /= got) $ error $ "INCORRECT VALUE for " ++ show i addBang :: [Logic] -> IO [Logic] addBang xs = do i <- randomRIO (0, length xs - 1) let (before,now:after) = splitAt i xs now <- f now return $ before ++ now : after where f (Logic log xs) = do i <- randomRIO (0, length xs) let (before,after) = splitAt i xs return $ Logic log $ before ++ [Bang] : after randomLogic :: IO [Logic] -- only 
Logic constructors randomLogic = do rules <- randomRIO (1,100) f rules $ map Input inputRange where f 0 avail = return [] f i avail = do needs <- randomRIO (0,3) xs <- replicateM needs $ do ns <- randomRIO (0,3) replicateM ns $ randomElem avail let r = Logic i xs fmap (r:) $ f (i-1) (Output i:avail) shake-0.15.5/src/Test/Progress.hs0000644000000000000000000000415012560222036014727 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable #-} module Test.Progress(main) where import Development.Shake.Progress import Test.Type import System.Directory import System.FilePath import Data.Monoid import Prelude main = shaken test $ \args obj -> return () -- | Given a list of todo times, get out a list of how long is predicted prog = progEx 10000000000000000 progEx :: Double -> [Double] -> IO [Double] progEx mxDone todo = do let resolution = 10000 -- Use resolution to get extra detail on the numbers let done = scanl (+) 0 $ map (min mxDone . max 0) $ zipWith (-) todo (tail todo) let res = progressReplay $ zip (map (*resolution) [1..]) $ tail $ zipWith (\t d -> mempty{timeBuilt=d*resolution,timeTodo=(t*resolution,0)}) todo done return $ (0/0) : map ((/ resolution) . 
actualSecs) res test build obj = do -- perfect functions should match perfectly xs <- prog [10,9..1] drop 2 xs === [8,7..1] xs <- prog $ map (*5) [10,9..1] drop 2 xs === [8,7..1] xs <- prog $ map (*0.2) [10,9..1] let dp3 x = fromIntegral (round $ x * 1000 :: Int) / 1000 map dp3 (drop 2 xs) === [8,7..1] -- The properties below this line could be weakened -- increasing functions can't match xs <- prog [5,6,7] last xs === 7 -- the first value must be plausible, or missing xs <- prog [187] assert (isNaN $ head xs) "No first value" -- desirable properties, could be weakened xs <- progEx 2 $ 100:map (*2) [10,9..1] drop 5 xs === [6,5..1] xs <- progEx 1 $ [10,9,100,8,7,6,5,4,3,2,1] assert (all (<= 1.5) $ map abs $ zipWith (-) (drop 5 xs) [6,5..1]) "Close" -- if no progress is made, don't keep the time going up xs <- prog [10,9,8,7,7,7,7,7] drop 5 xs === [7,7,7] -- if the work rate changes, should somewhat reflect that xs <- prog [10,9,8,7,6.5,6,5.5,5] assert (last xs > 7.1) "Some discounting (factor=0 would give 7)" xs <- getDirectoryContents "src/Test/Progress" build $ ["--progress=replay=src/Test/Progress/" ++ x | x <- xs, takeExtension x == ".prog"] ++ ["--no-report","--report=-","--report=" ++ obj "progress.html"] shake-0.15.5/src/Test/Pool.hs0000644000000000000000000000435012560222036014036 0ustar0000000000000000 module Test.Pool(main) where import Test.Type import Development.Shake.Pool import Control.Concurrent import Control.Exception hiding (assert) import Control.Monad main = shaken test $ \args obj -> return () test build obj = do let wait = sleep 0.01 forM_ [False,True] $ \deterministic -> do -- check that it aims for exactly the limit forM_ [1..6] $ \n -> do var <- newMVar (0,0) -- (maximum, current) runPool deterministic n $ \pool -> forM_ [1..5] $ \i -> addPool pool $ do modifyMVar_ var $ \(mx,now) -> return (max (now+1) mx, now+1) wait modifyMVar_ var $ \(mx,now) -> return (mx,now-1) res <- takeMVar var res === (min n 5, 0) -- check that exceptions are 
immediate self <- myThreadId handle (\(ErrorCall msg) -> msg === "pass") $ runPool deterministic 3 $ \pool -> do addPool pool $ do wait error "pass" addPool pool $ do wait >> wait throwTo self $ ErrorCall "fail" wait >> wait -- give chance for a delayed exception -- check someone spawned when at zero todo still gets run done <- newMVar False runPool deterministic 1 $ \pool -> addPool pool $ do wait addPool pool $ do wait modifyMVar_ done $ const $ return True done <- readMVar done assert done "Waiting on someone" -- check that killing a thread pool stops the tasks, bug 545 thread <- newEmptyMVar done <- newEmptyMVar res <- newMVar True t <- forkIO $ finally (putMVar done ()) $ runPool deterministic 1 $ \pool -> addPool pool $ do t <- takeMVar thread killThread t wait -- allow the thread to die first modifyMVar_ res (const $ return False) putMVar thread t takeMVar done wait >> wait >> wait -- allow the bad thread to continue res <- readMVar res assert res "Early termination" shake-0.15.5/src/Test/OrderOnly.hs0000644000000000000000000000332612560222036015044 0ustar0000000000000000 module Test.OrderOnly(main) where import Development.Shake import Test.Type import System.Directory(removeFile) import Control.Exception.Extra main = shaken test $ \args obj -> do want $ map obj args obj "bar.txt" %> \out -> do alwaysRerun writeFile' out =<< liftIO (readFile $ obj "bar.in") obj "foo.txt" %> \out -> do let src = obj "bar.txt" orderOnly [src] writeFile' out =<< liftIO (readFile src) need [src] obj "baz.txt" %> \out -> do let src = obj "bar.txt" orderOnly [src] liftIO $ appendFile out "x" obj "primary.txt" %> \out -> do need [obj "source.txt"] orderOnly [obj "intermediate.txt"] writeFile' out =<< liftIO (readFile $ obj "intermediate.txt") obj "intermediate.txt" %> \out -> do copyFile' (obj "source.txt") out test build obj = do writeFile (obj "bar.in") "in" build ["foo.txt","--sleep"] assertContents (obj "foo.txt") "in" writeFile (obj "bar.in") "out" build ["foo.txt","--sleep"] 
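The builds around this point exercise the difference between `need` and `orderOnly`: an order-only dependency is brought up to date before the rule body runs, but later changes to it do not dirty the rule. A self-contained sketch of that distinction (file names are illustrative, not from this test):

```haskell
-- gen.h is an order-only dependency of main.o: it must exist before the
-- compile runs, but editing it later will not trigger a recompile.
import Development.Shake

rules :: Rules ()
rules = do
    "gen.h" %> \out -> writeFile' out "#define X 1"
    "main.o" %> \out -> do
        orderOnly ["gen.h"]     -- ensure gen.h is built, but do not track it
        need ["main.c"]         -- a real dependency: changes cause rebuilds
        cmd "gcc -c main.c -o" [out]
```

This mirrors the `baz.txt` case above, where rewriting `bar.in` after the first build leaves the order-only consumer untouched.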
assertContents (obj "foo.txt") "out" writeFile (obj "baz.txt") "" writeFile (obj "bar.in") "in" build ["baz.txt","--sleep"] assertContents (obj "baz.txt") "x" writeFile (obj "bar.in") "out" build ["baz.txt"] assertContents (obj "baz.txt") "x" ignore $ removeFile $ obj "intermediate.txt" writeFile (obj "source.txt") "x" build ["primary.txt","--sleep"] assertContents (obj "intermediate.txt") "x" removeFile $ obj "intermediate.txt" build ["primary.txt","--sleep"] assertMissing $ obj "intermediate.txt" writeFile (obj "source.txt") "y" build ["primary.txt","--sleep"] assertContents (obj "intermediate.txt") "y" shake-0.15.5/src/Test/Oracle.hs0000644000000000000000000000545112560222036014335 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} module Test.Oracle(main) where import Development.Shake import Test.Type import Control.Monad main = shaken test $ \args obj -> do let f name lhs rhs = (,) name $ (do addOracle $ \k -> let _ = k `asTypeOf` lhs in return rhs; return () ,let o = obj name ++ ".txt" in do want [o]; o %> \_ -> do v <- askOracleWith lhs rhs; writeFile' o $ show v) let tbl = [f "str-bool" "" True ,f "str-int" "" (0::Int) ,f "bool-str" True "" ,f "int-str" (0::Int) ""] forM_ args $ \a -> case a of '+':x | Just (add,_) <- lookup x tbl -> add '*':x | Just (_,use) <- lookup x tbl -> use '@':key -> do addOracle $ \() -> return key; return () '%':name -> let o = obj "unit.txt" in do want [o]; o %> \_ -> do {askOracleWith () ""; writeFile' o name} '!':name -> do want [obj "rerun"]; obj "rerun" %> \out -> do alwaysRerun; writeFile' out name test build obj = do build ["clean"] -- check it rebuilds when it should build ["@key","%name"] assertContents (obj "unit.txt") "name" build ["@key","%test"] assertContents (obj "unit.txt") "name" build ["@foo","%test"] assertContents (obj "unit.txt") "test" -- check adding/removing redundant oracles does not trigger a rebuild build ["@foo","%newer","+str-bool"] assertContents (obj "unit.txt") "test" build 
["@foo","%newer","+str-int"] assertContents (obj "unit.txt") "test" build ["@foo","%newer"] assertContents (obj "unit.txt") "test" -- check always run works build ["!foo"] assertContents (obj "rerun") "foo" build ["!bar"] assertContents (obj "rerun") "bar" -- check error messages are good let errors args err = assertException [err] $ build $ "--quiet" : args build ["+str-int","*str-int"] errors ["*str-int"] -- Building with an an Oracle that has been removed "missing a call to addOracle" errors ["*str-bool"] -- Building with an Oracle that I know nothing about "missing a call to addOracle" build ["+str-int","*str-int"] errors ["+str-bool","*str-int"] -- Building with an Oracle that has changed type "askOracle is used at the wrong type" errors ["+str-int","+str-bool"] -- Two Oracles with the same question type "Only one call to addOracle is allowed" errors ["+str-int","*str-bool"] -- Using an Oracle at the wrong answer type "askOracle is used at the wrong type" build ["+str-int","+str-int"] -- Two Oracles work if they aren't used errors ["+str-int","+str-int","*str-int"] -- Two Oracles fail if they are used "Only one call to addOracle is allowed" errors ["+str-int","+str-bool"] -- Two Oracles with the same answer type "Only one call to addOracle is allowed" shake-0.15.5/src/Test/Ninja.hs0000644000000000000000000001050412560222036014162 0ustar0000000000000000 module Test.Ninja(main) where import Development.Shake import qualified Development.Shake.Config as Config import System.Directory(copyFile, createDirectoryIfMissing, removeFile) import Control.Monad import Test.Type import qualified Data.HashMap.Strict as Map import Data.List import Data.Maybe import System.IO.Extra import qualified Run import System.Environment main = shaken test $ \args obj -> do let args2 = ("-C" ++ obj "") : map tail (filter ("@" `isPrefixOf`) args) let real = "real" `elem` args action $ if real then cmd "ninja" args2 else liftIO $ withArgs args2 Run.main test build obj = do -- when calling 
run anything with a leading @ gets given to Shake, anything without gets given to Ninja let run xs = build $ "--exception" : map (\x -> fromMaybe ('@':x) $ stripPrefix "@" x) (words xs) let runFail xs bad = assertException [bad] $ run $ xs ++ " --quiet" build ["clean"] run "-f../../src/Test/Ninja/test1.ninja" assertExists $ obj "out1.txt" run "-f../../src/Test/Ninja/test2.ninja" assertExists $ obj "out2.2" assertMissing $ obj "out2.1" build ["clean"] run "-f../../src/Test/Ninja/test2.ninja out2.1" assertExists $ obj "out2.1" assertMissing $ obj "out2.2" copyFile "src/Test/Ninja/test3-sub.ninja" $ obj "test3-sub.ninja" copyFile "src/Test/Ninja/test3-inc.ninja" $ obj "test3-inc.ninja" createDirectoryIfMissing True $ obj "subdir" copyFile "src/Test/Ninja/subdir/1.ninja" $ obj "subdir/1.ninja" copyFile "src/Test/Ninja/subdir/2.ninja" $ obj "subdir/2.ninja" run "-f../../src/Test/Ninja/test3.ninja" assertContentsWords (obj "out3.1") "g4+b1+++i1" assertContentsWords (obj "out3.2") "g4++++i1" assertContentsWords (obj "out3.3") "g4++++i1" assertContentsWords (obj "out3.4") "g4+++s1+s2" run "-f../../src/Test/Ninja/test4.ninja out" assertExists $ obj "out.txt" assertExists $ obj "out2.txt" run "-f../../src/Test/Ninja/test5.ninja" assertExists $ obj "output file" writeFile (obj "nocreate.log") "" writeFile (obj "nocreate.in") "" run "-f../../src/Test/Ninja/nocreate.ninja" assertContentsWords (obj "nocreate.log") "x" run "-f../../src/Test/Ninja/nocreate.ninja" run "-f../../src/Test/Ninja/nocreate.ninja" assertContentsWords (obj "nocreate.log") "x x x" writeFile (obj "input") "" runFail "-f../../src/Test/Ninja/lint.ninja bad --lint" "'needed' file required rebuilding" run "-f../../src/Test/Ninja/lint.ninja good --lint" runFail "-f../../src/Test/Ninja/lint.ninja bad --lint" "not a pre-dependency" res <- fmap (drop 1 . lines . 
fst) $ captureOutput $ run "-f../../src/Test/Ninja/compdb.ninja -t compdb cxx @--no-report @--quiet" want <- fmap lines $ readFile "src/Test/Ninja/compdb.output" let eq a b | (a1,'*':a2) <- break (== '*') a = unless (a1 `isPrefixOf` b && a2 `isSuffixOf` b) $ a === b | otherwise = a === b length want === length res zipWithM_ eq want res -- Test initial variable bindings and variables in include/subninja statements let test6 = obj "test6" copyFile "src/Test/Ninja/test6-sub.ninja" $ test6 ++ "-sub.ninja" copyFile "src/Test/Ninja/test6-inc.ninja" $ test6 ++ "-inc.ninja" copyFile "src/Test/Ninja/test6.ninja" $ test6 ++ ".ninja" config <- Config.readConfigFileWithEnv [("v1", test6)] $ test6 ++ ".ninja" -- The file included by subninja should have a separate variable scope Map.lookup "v2" config === Just "g2" run "-f../../src/Test/Ninja/phonyorder.ninja bar.txt" -- tests from ninjasmith: https://github.com/ndmitchell/ninjasmith/ run "-f../../src/Test/Ninja/redefine.ninja" assertContentsWords (obj "redefine.txt") "version3 version2" run "-f../../src/Test/Ninja/buildseparate.ninja" assertContentsWords (obj "buildseparate.txt") "XX" when False $ do -- currently fails because Shake doesn't match Ninja here run "-f../../src/Test/Ninja/outputtouch.ninja" assertContentsWords (obj "outputtouch.txt") "hello" writeFile (obj "outputtouch.txt") "goodbye" run "-f../../src/Test/Ninja/outputtouch.ninja" assertContentsWords (obj "outputtouch.txt") "goodbye" removeFile (obj "outputtouch.txt") run "-f../../src/Test/Ninja/outputtouch.ninja" assertContentsWords (obj "outputtouch.txt") "hello" shake-0.15.5/src/Test/Monad.hs0000644000000000000000000000541312560222036014164 0ustar0000000000000000 module Test.Monad(main) where import Test.Type import Development.Shake.Monad import Data.IORef import Control.Concurrent import Control.Exception hiding (assert) import Control.Monad import Control.Monad.IO.Class main = shaken test $ \args obj -> return () run :: ro -> rw -> RAW ro rw a -> IO a run ro 
rw m = do res <- newEmptyMVar runRAW ro rw m $ void . tryPutMVar res either throwIO return =<< readMVar res test build obj = do let conv x = either (Left . fromException) Right x :: Either (Maybe ArithException) Int let dump ro rw = do liftIO . (=== ro) =<< getRO; liftIO . (=== rw) =<< getRW -- test the basics plus exception handling run 1 "test" $ do dump 1 "test" putRW "more" dump 1 "more" res <- tryRAW $ withRO (+3) $ do dump 4 "more" withRW (++ "x") $ do dump 4 "morex" dump 4 "more" return 100 liftIO $ conv res === Right 100 dump 1 "more" putRW "new" dump 1 "new" res <- tryRAW $ withRO (+2) $ do dump 3 "new" withRW (++ "x") $ do dump 3 "newx" throwRAW Overflow error "Should not have reached here" return 9 liftIO $ conv res === Left (Just Overflow) dump 1 "new" catchRAW (catchRAW (throwRAW Overflow) $ \_ -> modifyRW (++ "x")) $ \_ -> modifyRW (++ "y") dump 1 "newx" catchRAW (catchRAW (throwRAW Overflow) $ \e -> modifyRW (++ "x") >> throwRAW e) $ \_ -> modifyRW (++ "y") dump 1 "newxxy" -- test capture run 1 "test" $ do i <- captureRAW $ \k -> k $ Right 1 liftIO $ i === 1 i <- tryRAW $ captureRAW $ \k -> k $ Left $ toException Overflow liftIO $ conv i === Left (Just Overflow) captureRAW $ \k -> k $ Right () i <- tryRAW $ throwRAW Underflow liftIO $ conv i === Left (Just Underflow) -- catch does not scope too far res <- try $ run 1 "test" $ fmap (either show id) $ tryRAW $ captureRAW $ \k -> throwIO Overflow res === Left Overflow res <- try $ run 1 "test" $ do captureRAW $ \k -> throwIO Overflow return "x" res === Left Overflow -- catch works properly if continuation called multiple times ref <- newIORef [] run 1 "test" $ flip catchRAW (const $ liftIO $ modifyIORef ref ('x':)) $ do captureRAW $ \k -> do k $ Right () k $ Right () k $ Left $ toException Overflow k $ Right () k $ Left $ toException Overflow flip catchRAW (const $ liftIO $ modifyIORef ref ('y':)) $ throwRAW $ toException Overflow (===) "xyxyy" =<< readIORef ref 
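
-- The Test.Monad module above exercises continuation capture (captureRAW)
-- and exception routing in Shake's internal RAW monad. A minimal sketch of
-- the same pattern (hypothetical names `CPS`, `capture`, `run`; not part of
-- Shake's public API): a computation receives its own continuation and may
-- feed it either a value or an exception, and the runner rethrows a captured
-- exception, mirroring the `run` helper defined in Test.Monad.

```haskell
module Main(main) where

import Control.Exception
import Data.IORef

-- Continuation-passing computations over Either SomeException,
-- in the spirit of RAW's captureRAW/tryRAW.
newtype CPS a = CPS { runCPS :: (Either SomeException a -> IO ()) -> IO () }

-- Hand the computation its own continuation.
capture :: ((Either SomeException a -> IO ()) -> IO ()) -> CPS a
capture = CPS

-- Run to completion, rethrowing any exception fed to the continuation.
run :: CPS a -> IO a
run (CPS f) = do
    ref <- newIORef $ Left $ toException $ ErrorCall "no result"
    f $ writeIORef ref
    either throwIO return =<< readIORef ref

main :: IO ()
main = do
    x <- run $ capture $ \k -> k $ Right (1 + 2 :: Int)
    print x -- prints 3
```

Calling the continuation with `Left` instead makes `run` throw, which is the behaviour the `conv res === Left (Just Overflow)` assertions above rely on.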
shake-0.15.5/src/Test/Manual.hs

module Test.Manual(main) where

import Development.Shake hiding (copyFileChanged)
import Development.Shake.FilePath
import Test.Type
import System.Info.Extra


main = shaken test $ \args obj ->
    action $ liftIO $ error "The 'manual' example should only be used in test mode"

test build obj = do
    copyDirectoryChanged "docs/manual" $ obj "manual"
    copyDirectoryChanged "src/Development" $ obj "manual/Development"
    copyDirectoryChanged "src/General" $ obj "manual/General"
    copyFileChanged "src/Paths.hs" $ obj "manual/Paths_shake.hs"
    let cmdline = if isWindows then "build.bat" else "/bin/sh build.sh"
    () <- cmd [Cwd $ obj "manual", Shell] cmdline "-j2"
    assertExists $ obj "manual/_build/run" <.> exe
    () <- cmd [Cwd $ obj "manual", Shell] cmdline
    () <- cmd [Cwd $ obj "manual", Shell] [cmdline,"clean"]
    assertMissing $ obj "manual/_build/run" <.> exe

shake-0.15.5/src/Test/Makefile.hs

module Test.Makefile(main) where

import Development.Shake(action, liftIO)
import qualified Run as Makefile
import System.Environment
import Test.Type
import Control.Monad
import Data.List
import Data.Maybe


main = shaken test $ \args obj -> action $ liftIO $ do
    unless (["@@"] `isPrefixOf` args) $
        error "The 'makefile' example should only be used in test mode, to test using a makefile use the 'make' example."
withArgs [fromMaybe x $ stripPrefix "@" x | x <- drop 1 args] Makefile.main test build obj = do copyDirectoryChanged "src/Test/MakeTutor" $ obj "MakeTutor" build ["@@","--directory=" ++ obj "MakeTutor","--no-report"] build ["@@","--directory=" ++ obj "MakeTutor","--no-report"] build ["@@","--directory=" ++ obj "MakeTutor","@clean","--no-report"] writeFile (obj "output.txt") "goodbye" writeFile (obj "Shakefile.hs") "main = writeFile \"output.txt\" \"hello\"" build ["@@","--directory=" ++ obj ""] assertContents (obj "output.txt") "hello" shake-0.15.5/src/Test/Live.hs0000644000000000000000000000150612560222036014024 0ustar0000000000000000 module Test.Live(main) where import Development.Shake import Test.Type main = shaken test $ \args obj -> do want $ map obj args obj "foo" %> \ out -> do need [obj "bar"] writeFile' out "" obj "bar" %> \out -> writeFile' out "" obj "baz" %> \out -> writeFile' out "" test build obj = do build ["clean"] build ["foo","baz","--live=" ++ obj "live.txt"] assertContentsUnordered (obj "live.txt") $ map obj $ words "foo bar baz" build ["foo","baz","--live=" ++ obj "live.txt"] assertContentsUnordered (obj "live.txt") $ map obj $ words "foo bar baz" build ["foo","--live=" ++ obj "live.txt"] assertContentsUnordered (obj "live.txt") $ map obj $ words "foo bar" build ["bar","--live=" ++ obj "live.txt"] assertContentsUnordered (obj "live.txt") $ map obj $ words "bar" shake-0.15.5/src/Test/Lint.hs0000644000000000000000000001066412560222036014040 0ustar0000000000000000 module Test.Lint(main) where import Development.Shake import Development.Shake.FilePath import Test.Type import Control.Exception hiding (assert) import System.Directory as IO import System.Info.Extra import Control.Monad.Extra main = shaken test $ \args obj -> do want $ map obj args addOracle $ \() -> do liftIO $ createDirectoryIfMissing True $ obj "dir" liftIO $ setCurrentDirectory $ obj "dir" return () obj "changedir" %> \out -> do () <- askOracle () writeFile' out "" obj "pause.*" 
%> \out -> do liftIO $ sleep 0.1 need [obj "cdir" <.> takeExtension out] writeFile' out "" obj "cdir.*" %> \out -> do pwd <- liftIO getCurrentDirectory let dir2 = obj $ "dir" ++ takeExtension out liftIO $ createDirectoryIfMissing True dir2 liftIO $ setCurrentDirectory dir2 liftIO $ sleep 0.2 liftIO $ setCurrentDirectory pwd writeFile' out "" obj "createonce" %> \out -> do writeFile' out "X" obj "createtwice" %> \out -> do need [obj "createonce"] liftIO sleepFileTime writeFile' (obj "createonce") "Y" writeFile' out "" obj "listing" %> \out -> do writeFile' (out <.> "ls1") "" getDirectoryFiles (obj "") ["//*.ls*"] writeFile' (out <.> "ls2") "" writeFile' out "" obj "existance" %> \out -> do Development.Shake.doesFileExist $ obj "exists" writeFile' (obj "exists") "" writeFile' out "" obj "gen*" %> \out -> do writeFile' out out obj "needed1" %> \out -> do needed [obj "gen1"] writeFile' out "" obj "needed2" %> \out -> do orderOnly [obj "gen2"] needed [obj "gen2"] writeFile' out "" obj "tracker-write1" %> \out -> do gen "x" $ out <.> "txt" need [out <.> "txt"] writeFile' out "" obj "tracker-write2" %> \out -> do gen "x" $ out <.> "txt" writeFile' out "" obj "tracker-source2" %> \out -> copyFile' (obj "tracker-source1") out obj "tracker-read1" %> \out -> do access $ toNative (obj "tracker-source1") writeFile' out "" obj "tracker-read2" %> \out -> do access $ toNative (obj "tracker-source1") need [obj "tracker-source1"] writeFile' out "" obj "tracker-read3" %> \out -> do access $ toNative (obj "tracker-source2") need [obj "tracker-source2"] writeFile' out "" obj "tracker-compile.o" %> \out -> do need [obj "tracker-source.c", obj "tracker-source.h"] cmd "gcc" ["-c", obj "tracker-source.c", "-o", out] where gen t f = unit $ if isWindows then cmd "cmd /c" ["echo" ++ t ++ " > " ++ f] else cmd Shell "echo" ["x", ">", f] access f = unit $ if isWindows then cmd "cmd /c" ["type " ++ f ++ " > nul"] else cmd Shell "cat" [f, ">/dev/null"] test build obj = do dir <- 
getCurrentDirectory let crash args parts = do assertException parts (build $ "--quiet" : args) `finally` setCurrentDirectory dir crash ["changedir"] ["current directory has changed"] build ["cdir.1","cdir.2","-j1"] build ["--clean","cdir.1","pause.2","-j1"] crash ["--clean","cdir.1","pause.2","-j2"] ["before building output/lint/","current directory has changed"] crash ["existance"] ["changed since being depended upon"] crash ["createtwice"] ["changed since being depended upon"] crash ["listing"] ["changed since being depended upon","output/lint"] crash ["--clean","listing","existance"] ["changed since being depended upon"] crash ["needed1"] ["'needed' file required rebuilding"] build ["needed2"] when False $ whenM hasTracker $ do writeFile (obj "tracker-source1") "" writeFile (obj "tracker-source2") "" writeFile (obj "tracker-source.c") "#include \n#include \"tracker-source.h\"\n" writeFile (obj "tracker-source.h") "" crash ["tracker-write1"] ["not have its creation tracked","lint/tracker-write1","lint/tracker-write1.txt"] build ["tracker-write2"] crash ["tracker-read1"] ["used but not depended upon","lint/tracker-source1"] build ["tracker-read2"] crash ["tracker-read3"] ["depended upon after being used","lint/tracker-source2"] build ["tracker-compile.o"] shake-0.15.5/src/Test/Journal.hs0000644000000000000000000000166612560222036014546 0ustar0000000000000000 module Test.Journal(main) where import Control.Monad import Data.IORef import Development.Shake import Development.Shake.FilePath import Test.Type import System.IO.Unsafe {-# NOINLINE rebuilt #-} rebuilt :: IORef Int rebuilt = unsafePerformIO $ newIORef 0 main = shaken test $ \args obj -> do want $ map obj ["a.out","b.out","c.out"] obj "*.out" %> \out -> do liftIO $ atomicModifyIORef rebuilt $ \a -> (a+1,()) copyFile' (out -<.> "in") out test build obj = do let change x = writeFile (obj $ x <.> "in") x let count x = do before <- readIORef rebuilt build ["--sleep"] after <- readIORef rebuilt x === after - 
before change "a" change "b" change "c" count 3 -- test that compressing the database doesn't corrupt anything replicateM_ 4 $ do change "a" count 1 change "a" change "c" count 2 shake-0.15.5/src/Test/Files.hs0000644000000000000000000000276012560222036014172 0ustar0000000000000000 module Test.Files(main) where import Development.Shake import Development.Shake.FilePath import Test.Type import Control.Monad import Data.List main = shaken test $ \args obj -> do let fun = "@" `elem` args let rest = delete "@" args want $ map obj $ if null rest then ["even.txt","odd.txt"] else rest -- Since &?> and &%> are implemented separately we test everything in both modes let deps &?%> act | fun = (\x -> if x `elem` deps then Just deps else Nothing) &?> act | otherwise = deps &%> act map obj ["even.txt","odd.txt"] &?%> \[evens,odds] -> do src <- readFileLines $ obj "numbers.txt" let (es,os) = partition even $ map read src writeFileLines evens $ map show es writeFileLines odds $ map show os map obj ["dir1/out.txt","dir2/out.txt"] &?%> \[a,b] -> do writeFile' a "a" writeFile' b "b" (\x -> let dir = takeDirectory x in if takeFileName dir /= "pred" then Nothing else Just [dir "a.txt",dir "b.txt"]) &?> \outs -> do mapM_ (`writeFile'` "") outs test build obj = do forM_ [[],["@"]] $ \args -> do let nums = unlines . 
map show writeFile (obj "numbers.txt") $ nums [1,2,4,5,2,3,1] build ("--sleep":args) assertContents (obj "even.txt") $ nums [2,4,2] assertContents (obj "odd.txt" ) $ nums [1,5,3,1] build ["clean"] build ["--no-build","--report=-"] build ["dir1/out.txt"] build ["pred/a.txt"] shake-0.15.5/src/Test/FilePattern.hs0000644000000000000000000000472512560222036015350 0ustar0000000000000000 module Test.FilePattern(main) where import Development.Shake.FilePattern import Development.Shake.FilePath import Data.Tuple.Extra import Test.Type main = shaken test $ \args obj -> return () test build obj = do let f b pat file = assert (b == (pat ?== file)) $ show pat ++ " ?== " ++ show file ++ "\nEXPECTED: " ++ show b f True "//*.c" "foo/bar/baz.c" f True (toNative "//*.c") "foo/bar\\baz.c" f True "*.c" "baz.c" f True "//*.c" "baz.c" f True "test.c" "test.c" f False "*.c" "foor/bar.c" f False "*/*.c" "foo/bar/baz.c" f False "foo//bar" "foobar" f False "foo//bar" "foobar/bar" f False "foo//bar" "foo/foobar" f True "foo//bar" "foo/bar" f True "foo/bar" (toNative "foo/bar") f True (toNative "foo/bar") "foo/bar" f True (toNative "foo/bar") (toNative "foo/bar") assert (compatible []) "compatible" assert (compatible ["//*a.txt","foo//a*.txt"]) "compatible" assert (not $ compatible ["//*a.txt","foo//a*.*txt"]) "compatible" extract "//*a.txt" "foo/bar/testa.txt" === ["foo/bar/","test"] extract "//*a.txt" "testa.txt" === ["","test"] extract "//*a*.txt" "testada.txt" === ["","test","da"] extract (toNative "//*a*.txt") "testada.txt" === ["","test","da"] substitute ["","test","da"] "//*a*.txt" === "testada.txt" substitute ["foo/bar/","test"] "//*a.txt" === "foo/bar/testa.txt" directories1 "*.xml" === ("",False) directories1 "//*.xml" === ("",True) directories1 "foo//*.xml" === ("foo",True) first toStandard (directories1 "foo/bar/*.xml") === ("foo/bar",False) directories1 "*/bar/*.xml" === ("",True) directories ["*.xml","//*.c"] === [("",True)] directories ["bar/*.xml","baz//*.c"] === 
[("bar",False),("baz",True)] directories ["bar/*.xml","baz//*.c"] === [("bar",False),("baz",True)] --------------------------------------------------------------------- -- LAZY SMALLCHECK PROPERTIES {- newtype Pattern = Pattern FilePattern deriving (Show,Eq) newtype Path = Path FilePath deriving (Show,Eq) -- Since / and * are the only "interesting" elements, just add ab to round out the set instance Serial Pattern where series = cons Pattern >< f where f = cons [] \/ cons (:) >< const (drawnFrom "/*ab") >< f instance Serial Path where series = cons Path >< f where f = cons [] \/ cons (:) >< const (drawnFrom "/ab") >< f testSmallCheck = do smallCheck 10 $ \(Pattern p) (Path x) -> p ?== x ==> substitute (extract p x) p == x -} shake-0.15.5/src/Test/FilePath.hs0000644000000000000000000000633012560222036014621 0ustar0000000000000000 module Test.FilePath(main) where import Development.Shake.FilePath import Development.Shake import qualified System.FilePath as Native import Test.Type import Test.QuickCheck import Control.Monad import Data.List import qualified Data.ByteString.Char8 as BS import qualified Development.Shake.ByteString as BS import System.Info.Extra main = shaken test $ \args obj -> return () newtype File = File String deriving Show instance Arbitrary File where arbitrary = fmap File $ listOf $ oneof $ map return "a /\\:." shrink (File x) = map File $ shrink x test build obj = do let a === b = a Test.Type.=== b -- duplicate definition in QuickCheck 2.7 and above let norm x = let s = toStandard $ normaliseEx x b = BS.unpack (BS.filepathNormalise $ BS.pack x) in if s == b then s else error $ show ("Normalise functions differ",x,s,b) -- basic examples norm "" === "." norm "." === "." norm "/" === "/" norm "./" === "./" norm "/." === "/." norm "/./" === "/" norm "a/." === "a" norm "./a" === "a" norm "./a/." === "a" norm "./a/./" === "a/" norm "a/.." === "." norm "a/./.." === "." 
    norm "a/../" === "./"
    norm "/a/../" === "/"
    norm "/a/./../" === "/"
    norm "../a" === "../a"
    norm "/../a/" === "/../a/"

    -- more realistic examples
    norm "neil//./test/moo/../bar/bob/../foo" === "neil/test/bar/foo"
    norm "bar/foo" === "bar/foo"
    norm "bar/foo/" === "bar/foo/"
    norm "../../foo" === "../../foo"
    norm "foo/../..///" === "../"
    norm "foo/bar/../../neil" === "neil"
    norm "foo/../bar/../neil" === "neil"
    norm "/foo/bar" === "/foo/bar"
    norm "//./" === (if isWindows then "//" else "/")
    norm "//foo/./bar" === (if isWindows then "//foo/bar" else "/foo/bar")
    when isWindows $ norm "c:\\foo\\bar" === "c:/foo/bar"

    Success{} <- quickCheckWithResult stdArgs{maxSuccess=1000} $ \(File x) ->
        let y = norm x
            sep = Native.isPathSeparator
            noDrive = if isWindows then drop 1 else id
            ps = [length y >= 1
                 ,null x || (sep (head x) == sep (head y) && sep (last x) == sep (last y))
                 ,not $ "/./" `isInfixOf` y
                 ,not isWindows || '\\' `notElem` y
                 ,not $ "//" `isInfixOf` noDrive y
                 ,".." `notElem` dropWhile (== "..") (dropWhile (\x -> all isPathSeparator x || isDrive x) $ splitDirectories y)
                 ,norm y == y]
        in if and ps then True else error $ show (x, y, ps)

    dropDirectory1 "aaa/bbb" === "bbb"
    dropDirectory1 "aaa/" === ""
    dropDirectory1 "aaa" === ""
    dropDirectory1 "" === ""
    takeDirectory1 "aaa/bbb" === "aaa"
    takeDirectory1 "aaa/" === "aaa"
    takeDirectory1 "aaa" === "aaa"
    searchPathSeparator === Native.searchPathSeparator
    pathSeparators === Native.pathSeparators

    if isWindows then
        toNative "//this is a///test\\here/" === "\\\\this is a\\\\\\test\\here\\"
    else
        toNative "//this is a///test\\here/" === "//this is a///test\\here/"

    -- check common manipulations work
    ("//*" <.> "foo") === "//*.foo"
    toStandard ("a" </> "b" </> "c" </> "*" <.> "exe") === "a/b/c//*.exe"
    ("a" </> "/b/c") === "a//b/c"

shake-0.15.5/src/Test/Errors.hs

module Test.Errors(main) where

import Development.Shake
import Development.Shake.FilePath
import Test.Type
import Data.List
import Control.Monad
import Control.Concurrent
import Control.Exception.Extra hiding (assert)
import System.Directory as IO
import qualified System.IO.Extra as IO


main = shaken2 test $ \args obj -> do
    obj "norule" %> \_ -> need [obj "norule_isavailable"]

    obj "failcreate" %> \_ -> return ()

    [obj "failcreates", obj "failcreates2"] &%> \_ -> writeFile' (obj "failcreates") ""

    obj "recursive" %> \out -> need [out]

    obj "systemcmd" %> \_ -> cmd "random_missing_command"

    obj "stack1" %> \_ -> need [obj "stack2"]
    obj "stack2" %> \_ -> need [obj "stack3"]
    obj "stack3" %> \_ -> error "crash"

    obj "staunch1" %> \out -> do
        liftIO $ sleep 0.1
        writeFile' out "test"
    obj "staunch2" %> \_ -> error "crash"

    let catcher out op = obj out %> \out -> do
            writeFile' out "0"
            op $ do src <- IO.readFile' out; writeFile out $ show (read src + 1 :: Int)
    catcher "finally1" $ actionFinally $ fail "die"
    catcher "finally2" $ actionFinally $ return ()
    catcher "finally3" $ actionFinally $ liftIO $ sleep 10
    catcher "finally4" $ actionFinally $ need ["wait"]
    "wait" ~> do liftIO $ sleep 10
    catcher "exception1" $ actionOnException $ fail "die"
    catcher "exception2" $ actionOnException $ return ()

    res <- newResource "resource_name" 1
    obj "resource" %> \out -> do
        withResource res 1 $ need ["resource-dep"]

    obj "overlap.txt" %> \out -> writeFile' out "overlap.txt"
    obj "overlap.t*" %> \out -> writeFile' out "overlap.t*"
    obj "overlap.*" %> \out -> writeFile' out "overlap.*"

    alternatives $ do
        obj "alternative.t*" %> \out -> writeFile' out "alternative.txt"
        obj "alternative.*" %> \out -> writeFile' out "alternative.*"

    obj "chain.2" %> \out -> do
        src <- readFile' $ obj "chain.1"
        if src == "err" then error "err_chain" else writeFileChanged out src
    obj "chain.3" %> \out -> copyFile' (obj "chain.2") out

    obj "tempfile" %> \out -> do
        file <- withTempFile $ \file -> do
            liftIO $ assertExists file
            return file
        liftIO $ assertMissing file
        withTempFile $ \file -> do
            liftIO $ assertExists file
            writeFile' out file
            fail "tempfile-died"

    obj "tempdir" %> \out -> do
        file <- withTempDir $ \dir -> do
            let file = dir </> "foo.txt"
            liftIO $ writeFile (dir </> "foo.txt") "" -- will throw if the directory does not exist
            writeFile' out ""
            return file
        liftIO $ assertMissing file

    phony "fail1" $ fail "die1"
    phony "fail2" $ fail "die2"

    when ("die" `elem` args) $ action $ error "death error"


test build obj = do
    let crash args parts = assertException parts (build $ "--quiet" : args)
    build ["clean"]
    build ["--sleep"]
    writeFile (obj "chain.1") "x"
    build ["$chain.3","--sleep"]
    writeFile (obj "chain.1") "err"
    crash ["$chain.3"] ["err_chain"]
    crash ["$norule"] ["norule_isavailable"]
    crash ["$failcreate"] ["failcreate"]
    crash ["$failcreates"] ["failcreates"]
    crash ["$recursive"] ["recursive"]
    crash ["$systemcmd"] ["systemcmd","random_missing_command"]
    crash ["$stack1"] ["stack1","stack2","stack3","crash"]

    b <- IO.doesFileExist $ obj "staunch1"
    when b $ removeFile $ obj "staunch1"
    crash ["$staunch1","$staunch2","-j2"] ["crash"]
    b <- IO.doesFileExist $ obj "staunch1"
    assert (not b) "File should not exist, should have crashed first"
    crash ["$staunch1","$staunch2","-j2","--keep-going","--silent"] ["crash"]
    b <- IO.doesFileExist $ obj "staunch1"
    assert b "File should exist, staunch should have let it be created"

    crash ["$finally1"] ["die"]
    assertContents (obj "finally1") "1"
    build ["$finally2"]
    assertContents (obj "finally2") "1"
    crash ["$exception1"] ["die"]
    assertContents (obj "exception1") "1"
    build ["$exception2"]
    assertContents (obj "exception2") "0"

    forM_ ["finally3","finally4"] $ \name -> do
        t <- forkIO $ ignore $ build ['$':name,"--exception"]
        retry 10 $ sleep 0.1 >> assertContents (obj name) "0"
        throwTo t (IndexOutOfBounds "test")
        retry 10 $ sleep 0.1 >> assertContents (obj name) "1"

    crash ["$resource"] ["cannot currently call apply","withResource","resource_name"]

    build ["$overlap.foo"]
    assertContents (obj "overlap.foo") "overlap.*"
    build ["$overlap.txt"]
    assertContents (obj "overlap.txt") "overlap.txt"
crash ["$overlap.txx"] ["key matches multiple rules","overlap.txx"] build ["$alternative.foo","$alternative.txt"] assertContents (obj "alternative.foo") "alternative.*" assertContents (obj "alternative.txt") "alternative.txt" crash ["$tempfile"] ["tempfile-died"] src <- readFile $ obj "tempfile" assertMissing src build ["$tempdir"] crash ["!die"] ["Shake","action","death error"] putStrLn "## BUILD errors" (out,_) <- IO.captureOutput $ build [] assert ("nothing to do" `isInfixOf` out) $ "Expected 'nothing to do', but got: " ++ out putStrLn "## BUILD errors fail1 fail2 -k -j2" (out,_) <- IO.captureOutput $ try_ $ build ["fail1","fail2","-k","-j2",""] assert ("die1" `isInfixOf` out && "die2" `isInfixOf` out) $ "Expected 'die1' and 'die2', but got: " ++ out shake-0.15.5/src/Test/Docs.hs0000644000000000000000000002633512560222036014024 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} module Test.Docs(main) where import Development.Shake import Development.Shake.FilePath import Test.Type import Data.Char import Data.List.Extra import Data.Maybe main = shaken noTest $ \args obj -> do let index = obj "dist/doc/html/shake/index.html" let config = obj "dist/setup-config" want [obj "Success.txt"] want $ map (\x -> fromMaybe (obj x) $ stripPrefix "!" x) args let needSource = need =<< getDirectoryFiles "." 
["src/Development/Shake.hs","src/Development/Shake//*.hs","src/Development/Ninja/*.hs","src/General//*.hs"] config %> \_ -> do need ["shake.cabal"] unit $ cmd "runhaskell Setup.hs configure" ["--builddir=" ++ obj "dist","--user"] trackAllow [obj "dist//*"] index %> \_ -> do need [config,"shake.cabal"] needSource need ["shake.cabal"] trackAllow [obj "dist//*"] cmd "runhaskell Setup.hs haddock" ["--builddir=" ++ obj "dist"] obj "Paths_shake.hs" %> \out -> do copyFile' "src/Paths.hs" out obj "Part_*.hs" %> \out -> do need ["src/Test/Docs.hs"] -- so much of the generator is in this module let noR = filter (/= '\r') src <- if "_md" `isSuffixOf` takeBaseName out then fmap (findCodeMarkdown . lines . noR) $ readFile' $ "docs/" ++ drop 5 (reverse (drop 3 $ reverse $ takeBaseName out)) ++ ".md" else fmap (findCodeHaddock . noR) $ readFile' $ obj $ "dist/doc/html/shake/" ++ replace "_" "-" (drop 5 $ takeBaseName out) ++ ".html" let f i (Stmt x) | any whitelist x = [] | otherwise = restmt i $ map undefDots $ trims x f i (Expr x) | takeWhile (not . isSpace) x `elem` types = ["type Expr_" ++ show i ++ " = " ++ x] | "import " `isPrefixOf` x = [x] | otherwise = ["expr_" ++ show i ++ " = (" ++ undefDots x2 ++ ")" | let x2 = trim $ dropComment x, not $ whitelist x2] code = concat $ zipWith f [1..] 
(nubOrd src) (imports,rest) = partition ("import " `isPrefixOf`) code writeFileLines out $ ["{-# LANGUAGE DeriveDataTypeable, RankNTypes, MultiParamTypeClasses, ExtendedDefaultRules, GeneralizedNewtypeDeriving #-}" ,"{-# LANGUAGE NoMonomorphismRestriction, ScopedTypeVariables #-}" ,"{-# OPTIONS_GHC -w #-}" ,"module " ++ takeBaseName out ++ "() where" ,"import Control.Applicative" ,"import Control.Concurrent" ,"import Control.Monad" ,"import Data.Char" ,"import Data.Data" ,"import Data.List" ,"import Data.Maybe" ,"import Data.Monoid" ,"import Development.Shake" ,"import Development.Shake.Classes" ,"import Development.Shake.Rule" ,"import Development.Shake.Util" ,"import Development.Shake.FilePath" ,"import System.Console.GetOpt" ,"import System.Directory(setCurrentDirectory)" ,"import qualified System.Directory" ,"import System.Process" ,"import System.Exit" ,"import Control.Monad.IO.Class" ,"import System.IO"] ++ ["import " ++ replace "_" "." (drop 5 $ takeBaseName out) | not $ "_md.hs" `isSuffixOf` out] ++ imports ++ ["(==>) :: Bool -> Bool -> Bool" ,"(==>) = undefined" ,"(<==) = ()" ,"infix 1 ==>" ,"forAll f = f undefined" ,"remaining = 1.1" ,"done = 1.1" ,"time_elapsed = 1.1" ,"old = \"\"" ,"new = \"\"" ,"myfile = \"\"" ,"inputs = [\"\"]" ,"files = [\"\"]" ,"input = \"\"" ,"output = \"\"" ,"opts = shakeOptions" ,"result = undefined :: IO (Maybe (Rules ()))" ,"launchMissiles = undefined :: Bool -> IO ()" ,"myVariable = ()" ,"instance Eq (OptDescr a)" ,"(foo,bar,baz) = undefined" ,"str1 = \"\"" ,"str2 = \"\"" ,"str = \"\""] ++ rest obj "Files.lst" %> \out -> do need ["src/Test/Docs.hs"] -- so much of the generator is in this module need [index,obj "Paths_shake.hs"] filesHs <- getDirectoryFiles (obj "dist/doc/html/shake") ["Development-*.html"] filesMd <- getDirectoryFiles "docs" ["*.md"] writeFileChanged out $ unlines $ ["Part_" ++ replace "-" "_" (takeBaseName x) | x <- filesHs, not $ "-Classes.html" `isSuffixOf` x] ++ ["Part_" ++ takeBaseName x ++ "_md" | x <- 
filesMd, takeBaseName x `notElem` ["Developing","Model"]] let needModules = do mods <- readFileLines $ obj "Files.lst"; need [obj m <.> "hs" | m <- mods]; return mods obj "Main.hs" %> \out -> do mods <- needModules writeFileLines out $ ["module Main(main) where"] ++ ["import " ++ m | m <- mods] ++ ["main = return ()"] obj "Success.txt" %> \out -> do needModules need [obj "Main.hs", obj "Paths_shake.hs"] needSource () <- cmd "runhaskell -ignore-package=hashmap " ["-i" ++ obj "","-isrc",obj "Main.hs"] writeFile' out "" data Code = Stmt [String] | Expr String deriving (Show,Eq,Ord) findCodeHaddock :: String -> [Code] findCodeHaddock x | Just x <- stripPrefix "
" x = f (Stmt . shift . lines . strip) "
" x | Just x <- stripPrefix "" x = f (Expr . strip) "" x where f ctor end x | Just x <- stripPrefix end x = ctor "" : findCodeHaddock x f ctor end (x:xs) = f (ctor . (x:)) end xs findCodeHaddock (x:xs) = findCodeHaddock xs findCodeHaddock [] = [] findCodeMarkdown :: [String] -> [Code] findCodeMarkdown (x:xs) | indented x && not (blank x) = let (a,b) = span (\x -> indented x || blank x) (x:xs) in Stmt (map (drop 4) a) : findCodeMarkdown b where indented x = length (takeWhile isSpace x) >= 4 blank x = all isSpace x findCodeMarkdown (x:xs) = f x ++ findCodeMarkdown xs where f ('`':xs) = let (a,b) = break (== '`') xs in Expr a : f (drop 1 b) f (x:xs) = f xs f [] = [] findCodeMarkdown [] = [] trims = reverse . dropWhile (all isSpace) . reverse . dropWhile (all isSpace) restmt i ("":xs) = restmt i xs restmt i (('-':'-':_):xs) = restmt i xs restmt i (x:xs) | " ?== " `isInfixOf` x || " == " `isInfixOf` x = zipWith (\j x -> "hack_" ++ show i ++ "_" ++ show j ++ " = " ++ x) [1..] (x:xs) restmt i (x:xs) | not ("let" `isPrefixOf` x) && not ("[" `isPrefixOf` x) && not ("cmd " `isPrefixOf` x) && (" = " `isInfixOf` x || " | " `isInfixOf` x || " :: " `isInfixOf` x) || "import " `isPrefixOf` x || "infix" `isPrefixOf` x || "instance " `isPrefixOf` x = map f $ x:xs where f x = if takeWhile (not . isSpace) x `elem` dupes then "_" ++ show i ++ "_" ++ x else x restmt i xs = ("stmt_" ++ show i ++ " = do") : map (" " ++) xs ++ [" undefined" | length xs == 1 && ("let" `isPrefixOf` (head xs) || "<-" `isInfixOf` (head xs))] shift :: [String] -> [String] shift xs | all null xs = xs | all (\x -> null x || " " `isPrefixOf` x) xs = shift $ map (drop 1) xs | otherwise = xs dropComment ('-':'-':_) = [] dropComment xs = onTail dropComment xs undefDots o = f o where f ('.':'.':'.':xs) = (if any (`elem` words o) ["cmd","Development.Shake.cmd"] then "[\"\"]" else "undefined") ++ (if "..." 
`isSuffixOf` xs then "" else undefDots xs) f xs = onTail f xs strip :: String -> String strip x | Just x <- stripPrefix "<i>" x , (a,b) <- break (== '<') x , not $ ("</i>" `isPrefixOf` b) && a `elem` italics = error $ "Unexpected italics in code block: " ++ a ++ take 5 b ++ "..." strip ('<':xs) = strip $ drop 1 $ dropWhile (/= '>') xs strip ('&':xs) | Just xs <- stripPrefix "quot;" xs = '\"' : strip xs | Just xs <- stripPrefix "lt;" xs = '<' : strip xs | Just xs <- stripPrefix "gt;" xs = '>' : strip xs | Just xs <- stripPrefix "amp;" xs = '&' : strip xs strip xs = onTail strip xs onTail f (x:xs) = x : f xs onTail f [] = [] italics :: [String] italics = words "extension command-name file-name" whitelist :: String -> Bool whitelist x | all (not . isSpace) x && takeExtension x `elem` words ".txt .hi .hs .o .exe .tar .cpp .cfg .dep .deps .h .c .html .zip" = True whitelist x | elem x $ words $ "newtype do MyFile.txt.digits excel a q m c x value key gcc cl os make contents tar ghc cabal clean _make distcc ghc " ++ ".. /./ /.. /../ ./ // \\ ../ //*.c //*.txt //* dir/*/* dir " ++ "ConstraintKinds TemplateHaskell GeneralizedNewtypeDeriving DeriveDataTypeable SetConsoleTitle " ++ "Data.List System.Directory Development.Shake.FilePath main.m run .rot13 " ++ "NoProgress Error src rot13 .js .json .trace about://tracing " ++ ".make/i586-linux-gcc/output _make/.database foo/.. 
file.src file.out build " ++ "/usr/special /usr/special/userbinary $CFLAGS %PATH% -O2 -j8 -j -j1 " ++ "-threaded -rtsopts -I0 Function extension $OUT $C_LINK_FLAGS $PATH xterm $TERM main opts result flagValues argValues " ++ "HEADERS_DIR /path/to/dir CFLAGS let -showincludes -MMD gcc.version linkFlags temp pwd touch code out err " ++ "_metadata/.shake.database _shake _shake/build ./build.sh build.sh build.bat [out] manual " ++ "docs/manual _build _build/run ninja depfile build.ninja ByteString ProcessHandle " ++ "Rule CmdResult ShakeValue Monoid Monad Eq Typeable Data " ++ -- work only with constraint kinds "@ndm_haskell file-name " ++ "*> " = True whitelist x | "Stdout out" `isInfixOf` x || "Stderr err" `isInfixOf` x = True whitelist x | "foo/" `isPrefixOf` x -- path examples = True whitelist x = x `elem` ["[Foo.hi, Foo.o]" ,"shake-progress" ,"main -j6" ,"main clean" ,"1m25s (15%)" ,"3m12s (82%)" ,"getPkgVersion $ GhcPkgVersion \"shake\"" ,"# command-name (for file-name)" ,"ghc --make MyBuildSystem -rtsopts -with-rtsopts=-I0" ,"-with-rtsopts" ,"-qg -qb" ,"gcc -MM" ,"# This is my Config file" ,"-g -I/path/to/dir -O2" ,"main _make/henry.txt" ,"build rules" ,"actions" ,"() <- cmd ..." ,"x <- inputs" ,"shakeFiles=\"_build\"" ,"#include \"" ,"pattern %> actions = (pattern ?==) ?> actions" -- because it overlaps ,"buildDir = \"_build\"" ,"-MMD -MF" ,"#!/bin/sh" ,"build _build/main.o" ,"build clean" ,"build -j8" ,"cabal update && cabal install shake" ,"shake-build-system" ,"runhaskell Build.hs" ,"runhaskell Build.hs clean" ,"gcc -c main.c -o main.o -MMD -MF main.m" ,"\"_build\" x -<.> \"o\"" ,"cmd \"gcc -o\" [out] os" ,"rot13 file.txt -o file.rot13" ,"file.rot13" ,"out -<.> \"txt\"" ,"[item1,item2,item2]" ,"runhaskell Build.hs" ,"cabal update" ,"cabal install shake" ,"shake -j4" ,"cmd \"gcc -o _make/run _build/main.o _build/constants.o\"" ,"$(LitE . StringL . 
loc_filename <$> location)" ] types = words $ "MVar IO String FilePath Maybe [String] Char ExitCode Change " ++ "Action Resource Assume FilePattern Development.Shake.FilePattern " ++ "Lint Verbosity Rules CmdOption Int Double" dupes = words "main progressSimple rules" shake-0.15.5/src/Test/Directory.hs0000644000000000000000000001115412560222036015071 0ustar0000000000000000 module Test.Directory(main) where import Development.Shake import Development.Shake.FilePath import Test.Type import Data.List import Control.Monad import System.Directory(getCurrentDirectory, setCurrentDirectory, createDirectory, createDirectoryIfMissing) import qualified System.Directory as IO -- Use escape characters, _o=* _l=/ __= readEsc ('_':'o':xs) = '*' : readEsc xs readEsc ('_':'l':xs) = '/' : readEsc xs readEsc ('_':'_':xs) = ' ' : readEsc xs readEsc (x:xs) = x : readEsc xs readEsc [] = [] showEsc = concatMap f where f '*' = "_o" f '/' = "_l" f ' ' = "__" f x = [x] main = shaken test $ \args obj -> do want $ map obj args obj "*.contents" %> \out -> writeFileLines out =<< getDirectoryContents (obj $ readEsc $ dropExtension $ unobj out) obj "*.dirs" %> \out -> writeFileLines out =<< getDirectoryDirs (obj $ readEsc $ dropExtension $ unobj out) obj "*.files" %> \out -> do let pats = readEsc $ dropExtension $ unobj out let (x:xs) = ["" | " " `isPrefixOf` pats] ++ words pats writeFileLines out . map toStandard =<< getDirectoryFiles (obj x) xs obj "*.exist" %> \out -> do let xs = map obj $ words $ readEsc $ dropExtension $ unobj out fs <- mapM doesFileExist xs ds <- mapM doesDirectoryExist xs let bool x = if x then "1" else "0" writeFileLines out $ zipWith (\a b -> bool a ++ bool b) fs ds obj "dots" %> \out -> do cwd <- liftIO getCurrentDirectory liftIO $ setCurrentDirectory $ obj "" b1 <- liftM2 (==) (getDirectoryContents ".") (getDirectoryContents "") b2 <- liftM2 (==) (getDirectoryDirs ".") (getDirectoryDirs "") b3 <- liftM2 (==) (getDirectoryFiles "." 
["*.txt"]) (getDirectoryFiles "" ["*.txt"]) b4 <- liftM2 (==) (getDirectoryFiles "." ["C.txt/*.txt"]) (getDirectoryFiles "" ["C.txt/*.txt"]) b5 <- liftM2 (==) (getDirectoryFiles "." ["//*.txt"]) (getDirectoryFiles "" ["//*.txt"]) liftIO $ setCurrentDirectory cwd writeFileLines out $ map show [b1,b2,b3,b4,b5] test build obj = do let demand x ys = let f = showEsc x in do build [f]; assertContents (obj f) $ unlines $ words ys build ["clean"] demand " *.txt.files" "" demand " //*.txt.files" "" demand ".dirs" "" demand "A.txt B.txt C.txt.exist" "00 00 00" writeFile (obj "A.txt") "" writeFile (obj "B.txt") "" createDirectory (obj "C.txt") writeFile (obj "C.txt/D.txt") "" writeFile (obj "C.txt/E.xtx") "" demand " *.txt.files" "A.txt B.txt" demand ".dirs" "C.txt" demand "A.txt B.txt C.txt.exist" "10 10 01" demand " //*.txt.files" "A.txt B.txt C.txt/D.txt" demand "C.txt *.txt.files" "D.txt" demand " *.txt //*.xtx.files" "A.txt B.txt C.txt/E.xtx" demand " C.txt/*.files" "C.txt/D.txt C.txt/E.xtx" build ["dots","--no-lint"] assertContents (obj "dots") $ unlines $ words "True True True True True" let removeTest pat del keep = do withTemporaryDirectory $ \dir -> do forM_ (del ++ keep) $ \s -> do createDirectoryIfMissing True $ dir </> takeDirectory s when (not $ hasTrailingPathSeparator s) $ writeFile (dir </> s) "" removeFiles dir pat createDirectoryIfMissing True dir forM_ (map ((,) False) del ++ map ((,) True) keep) $ \(b,s) -> do b2 <- (if hasTrailingPathSeparator s then IO.doesDirectoryExist else IO.doesFileExist) $ dir </> s when (b /= b2) $ do let f b = if b then "present" else "missing" error $ "removeFiles mismatch: with pattern " ++ show pat ++ ", " ++ s ++ " should be " ++ f b ++ " but is " ++ f b2 removeTest ["//bob"] ["test/bob","more/bob"] ["extra/obo"] removeTest ["bob"] ["bob/"] ["bar/"] removeTest ["*.hs"] ["test.hs"] ["extra/more.hs","new.txt"] removeTest ["baz"] ["baz"] ["foo","bar/bob"] removeTest ["baz"] ["baz/bob","baz/"] ["foo","bar/bob"] removeTest ["Foo//*"] 
["Foo/bar","Foo/Quux/bar","Foo/Quux/"] [] removeTest ["Foo//*"] ["Foo/"] ["bar"] removeTest ["baz"] [] ["test.hs","bar/","foo/"] removeTest ["bob//*"] [] ["test/bob/"] removeTest ["//bob"] ["test/bob/","test/"] [] removeTest ["//*.txt"] ["more/","more/a.txt"] [] removeTest ["//*.txt"] ["more/","more/a.txt/"] [] removeTest ["//*.txt"] ["more/","more/a.txt/","more/b.txt"] [] removeTest ["//*.txt"] [] ["more/"] removeTest ["a//b"] ["a/c/b"] [] removeFiles "non-existing-directory" ["*"] shake-0.15.5/src/Test/Digest.hs0000644000000000000000000000447612560222036014355 0ustar0000000000000000 module Test.Digest(main) where import Control.Monad import Development.Shake import Test.Type main = shaken test $ \args obj -> do if null args then want [obj "Out.txt",obj "Out2.txt"] else want $ map obj args obj "Out.txt" %> \out -> do txt <- readFile' $ obj "In.txt" liftIO $ appendFile out txt [obj "Out1.txt",obj "Out2.txt"] &%> \[out1,out2] -> do txt <- readFile' $ obj "In.txt" liftIO $ appendFile out1 txt liftIO $ appendFile out2 txt "leaf" ~> return () obj "node1.txt" %> \file -> do need ["leaf"]; writeFile' file "x" obj "node2.txt" %> \file -> do need [obj "node1.txt"]; liftIO $ appendFile file "x" test build obj = do let outs = take 1 $ map obj ["Out.txt","Out1.txt","Out2.txt"] let writeOut x = forM_ outs $ \out -> writeFile out x let writeIn x = writeFile (obj "In.txt") x let assertOut x = forM_ outs $ \out -> assertContents out x writeOut "" writeIn "X" build ["--sleep","--digest-and"] assertOut "X" -- should not involve a hash calculation (sadly no way to test that) build ["--sleep","--digest-and"] assertOut "X" writeIn "X" build ["--sleep","--digest-and"] assertOut "X" writeIn "X" build ["--sleep","--digest-or"] assertOut "XX" writeIn "X" build ["--sleep","--digest-and"] assertOut "XX" build ["--sleep","--digest-and"] writeOut "XX" build ["--sleep","--digest-and"] assertOut "XX" build ["--sleep","--digest-and"] writeOut "Y" build ["--sleep","--digest-and"] assertOut "YX" 
writeIn "X" build ["--sleep","--digest"] assertOut "YX" writeIn "Z" build ["--sleep","--digest-and-input"] assertOut "YXZ" build ["--sleep","--digest-and-input"] writeOut "YXZ" build ["--sleep","--digest-and-input"] assertOut "YXZZ" writeIn "Q" build ["--sleep","--digest-and-input"] assertOut "YXZZQ" writeIn "Q" build ["--sleep","--digest-and-input"] assertOut "YXZZQ" -- test for #218 forM_ [("--digest",1),("--digest-and",1),("--digest-or",2),("--digest-and-input",2),("",2)] $ \(flag,count) -> do writeFile (obj "node2.txt") "y" replicateM_ 2 $ build $ ["node2.txt","--sleep"] ++ [flag | flag /= ""] assertContents (obj "node2.txt") $ 'y' : replicate count 'x' shake-0.15.5/src/Test/Config.hs0000644000000000000000000000433612560222036014336 0ustar0000000000000000 module Test.Config(main) where import Development.Shake import Development.Shake.FilePath import Development.Shake.Config import Test.Type import Data.Char import qualified Data.HashMap.Strict as Map import Data.Maybe import System.Directory main = shaken test $ \args obj -> do want $ map obj ["hsflags.var","cflags.var","none.var","keys"] usingConfigFile $ obj "config" obj "*.var" %> \out -> do cfg <- getConfig $ map toUpper $ takeBaseName out liftIO $ appendFile (out -<.> "times") "X" writeFile' out $ fromMaybe "" cfg obj "keys" %> \out -> do liftIO $ appendFile (obj "keys.times") "X" liftIO . writeFile out . 
unwords =<< getConfigKeys test build obj = do build ["clean"] createDirectoryIfMissing True $ obj "" writeFile (obj "config") $ unlines ["HEADERS_DIR = /path/to/dir" ,"CFLAGS = -O2 -I${HEADERS_DIR} -g" ,"HSFLAGS = -O2"] build [] assertContents (obj "cflags.var") "-O2 -I/path/to/dir -g" assertContents (obj "hsflags.var") "-O2" assertContents (obj "none.var") "" assertContents (obj "keys") "CFLAGS HEADERS_DIR HSFLAGS" appendFile (obj "config") $ unlines ["CFLAGS = $CFLAGS -w"] build [] assertContents (obj "cflags.var") "-O2 -I/path/to/dir -g -w" assertContents (obj "hsflags.var") "-O2" assertContents (obj "cflags.times") "XX" assertContents (obj "hsflags.times") "X" assertContents (obj "keys.times") "X" -- Test readConfigFileWithEnv writeFile (obj "config") $ unlines ["HEADERS_DIR = ${SOURCE_DIR}/path/to/dir" ,"CFLAGS = -O2 -I${HEADERS_DIR} -g"] vars <- readConfigFileWithEnv [("SOURCE_DIR", "/path/to/src")] (obj "config") assert (Map.lookup "HEADERS_DIR" vars == Just "/path/to/src/path/to/dir") $ "readConfigFileWithEnv:" ++ " Expected: " ++ show (Just "/path/to/src/path/to/dir") ++ " Got: " ++ show (Map.lookup "HEADERS_DIR" vars) assert (Map.lookup "CFLAGS" vars == Just "-O2 -I/path/to/src/path/to/dir -g") $ "readConfigFileWithEnv:" ++ " Expected: " ++ show (Just "-O2 -I/path/to/src/path/to/dir -g") ++ " Got: " ++ show (Map.lookup "CFLAGS" vars) shake-0.15.5/src/Test/Command.hs0000644000000000000000000001414112560222036014502 0ustar0000000000000000{-# LANGUAGE Rank2Types, ScopedTypeVariables #-} module Test.Command(main) where import Control.Applicative import Development.Shake import Development.Shake.FilePath import Control.Exception.Extra import System.Time.Extra import Control.Monad.Extra import System.Directory import Test.Type import System.Exit import System.Process import Data.Tuple.Extra import Data.List.Extra import Control.Monad.IO.Class import System.Info.Extra import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Lazy.Char8 as LBS 
import Prelude main = shaken test $ \args obj -> do let helper = [toNative $ obj "shake_helper" <.> exe] let name !> test = do want [name | null args || name `elem` args] name ~> do need [obj "shake_helper" <.> exe]; test let helper_source = unlines ["import Control.Concurrent" ,"import Control.Monad" ,"import System.Directory" ,"import System.Environment" ,"import System.Exit" ,"import System.IO" ,"import qualified Data.ByteString.Lazy.Char8 as LBS" ,"main = do" ," args <- getArgs" ," forM_ args $ \\(a:rg) -> do" ," case a of" ," 'o' -> putStrLn rg" ," 'e' -> hPutStrLn stderr rg" ," 'x' -> exitFailure" ," 'c' -> putStrLn =<< getCurrentDirectory" ," 'v' -> putStrLn =<< getEnv rg" ," 'w' -> threadDelay $ floor $ 1000000 * (read rg :: Double)" ," 'r' -> LBS.putStr $ LBS.replicate (read rg) 'x'" ," 'i' -> putStr =<< getContents" ," hFlush stdout" ," hFlush stderr" ] obj "shake_helper.hs" %> \out -> do need ["src/Test/Command.hs"]; writeFileChanged out helper_source obj "shake_helper" <.> exe %> \out -> do need [obj "shake_helper.hs"]; cmd (Cwd $ obj "") "ghc --make" "shake_helper.hs -o shake_helper" "capture" !> do (Stderr err, Stdout out) <- cmd helper ["ostuff goes here","eother stuff here"] liftIO $ out === "stuff goes here\n" liftIO $ err === "other stuff here\n" liftIO $ waits $ \w -> do Stdouterr out <- cmd helper Shell ["o1",w,"e2",w,"o3"] out === "1\n2\n3\n" "failure" !> do (Exit e, Stdout (), Stderr ()) <- cmd helper "oo ee x" when (e == ExitSuccess) $ error "/= ExitSuccess" liftIO $ assertException ["BAD"] $ cmd helper "oo eBAD x" (EchoStdout False) (EchoStderr False) liftIO $ assertException ["MORE"] $ cmd helper "oMORE eBAD x" (WithStdout True) (WithStderr False) (EchoStdout False) (EchoStderr False) "cwd" !> do -- FIXME: Linux searches the Cwd argument for the file, Windows searches getCurrentDirectory helper <- liftIO $ canonicalizePath $ obj "shake_helper" <.> exe Stdout out <- cmd (Cwd $ obj "") helper "c" liftIO $ (===) (trim out) =<< canonicalizePath 
(dropTrailingPathSeparator $ obj "") "timeout" !> do offset <- liftIO offsetTime Exit exit <- cmd (Timeout 2) helper "w20" t <- liftIO offset putNormal $ "Timed out in " ++ showDuration t when (exit == ExitSuccess) $ error "== ExitSuccess" when (t < 2 || t > 8) $ error $ "failed to timeout, took " ++ show t "env" !> do -- use liftIO since it blows away PATH which makes lint-tracker stop working Stdout out <- liftIO $ cmd (Env [("FOO","HELLO SHAKE")]) Shell helper "vFOO" liftIO $ out === "HELLO SHAKE\n" Stdout out <- cmd (AddEnv "FOO" "GOODBYE SHAKE") Shell helper "vFOO" liftIO $ out === "GOODBYE SHAKE\n" "path" !> do let path = AddPath [dropTrailingPathSeparator $ obj ""] [] unit $ cmd $ obj "shake_helper" unit $ cmd $ obj "shake_helper" <.> exe unit $ cmd path Shell "shake_helper" unit $ cmd path "shake_helper" "file" !> do let file = obj "file.txt" unit $ cmd helper (FileStdout file) (FileStderr file) (EchoStdout False) (EchoStderr False) (WithStderr False) "ofoo ebar obaz" liftIO $ assertContents file "foo\nbar\nbaz\n" liftIO $ waits $ \w -> do Stderr err <- cmd helper (FileStdout file) (FileStderr file) ["ofoo",w,"ebar",w,"obaz"] err === "bar\n" assertContents file "foo\nbar\nbaz\n" "timer" !> do timer $ cmd helper "binary" !> do (Stdout str, Stdout bs) <- cmd BinaryPipes helper "ofoo" liftIO $ (===) (str, bs) $ second BS.pack $ dupe $ if isWindows then "foo\r\n" else "foo\n" (Stdout str, Stdout bs) <- cmd helper "ofoo" liftIO $ (str, bs) === ("foo\n", BS.pack $ if isWindows then "foo\r\n" else "foo\n") return () "large" !> do (Stdout (_ :: String), CmdTime t1) <- cmd helper "r10000000" (Stdout (_ :: LBS.ByteString), CmdTime t2) <- cmd helper "r10000000" t3 <- withTempFile $ \file -> fromCmdTime <$> cmd helper "r10000000" (FileStdout file) liftIO $ putStrLn $ "Capturing 10Mb takes: " ++ intercalate "," [s ++ " = " ++ showDuration d | (s,d) <- [("String",t1),("ByteString",t2),("File",t3)]] "stdin" !> do Stdout (x :: String) <- cmd helper "i" (Stdin "hello ") 
(StdinBS $ LBS.pack "world") liftIO $ x === "hello world" "async" !> do let file = obj "async.txt" pid <- cmd helper (FileStdout file) "w2" "ohello" Nothing <- liftIO $ getProcessExitCode pid ExitSuccess <- liftIO $ waitForProcess pid liftIO $ assertContents file "hello\n" test build obj = do -- reduce the overhead by running all the tests in parallel build ["-j4"] timer :: (CmdResult r, MonadIO m) => (forall r . CmdResult r => m r) -> m r timer act = do (CmdTime t, CmdLine x, r) <- act liftIO $ putStrLn $ "Command " ++ x ++ " took " ++ show t ++ " seconds" return r waits :: (String -> IO ()) -> IO () waits op = f 0 where f w | w > 1 = op "w10" | otherwise = catch_ (op $ "w" ++ show w) $ const $ f $ w + 0.1 shake-0.15.5/src/Test/Cache.hs0000644000000000000000000000216712560222036014134 0ustar0000000000000000 module Test.Cache(main) where import Development.Shake import Development.Shake.FilePath import Data.Char import Test.Type main = shaken test $ \args obj -> do want $ map obj args vowels <- newCache $ \file -> do src <- readFile' file liftIO $ appendFile (obj "trace.txt") "1" return $ length $ filter isDigit src obj "*.out*" %> \x -> writeFile' x . 
show =<< vowels (dropExtension x <.> "txt") test build obj = do writeFile (obj "trace.txt") "" writeFile (obj "vowels.txt") "abc123a" build ["vowels.out1","vowels.out2","-j3","--sleep"] assertContents (obj "trace.txt") "1" assertContents (obj "vowels.out1") "3" assertContents (obj "vowels.out2") "3" build ["vowels.out2","-j3"] assertContents (obj "trace.txt") "1" assertContents (obj "vowels.out1") "3" writeFile (obj "vowels.txt") "12xyz34" build ["vowels.out2","-j3","--sleep"] assertContents (obj "trace.txt") "11" assertContents (obj "vowels.out2") "4" build ["vowels.out1","-j3","--sleep"] assertContents (obj "trace.txt") "111" assertContents (obj "vowels.out1") "4" shake-0.15.5/src/Test/C.hs0000644000000000000000000000130612560222036013305 0ustar0000000000000000 module Test.C(main) where import Development.Shake import Development.Shake.FilePath import Test.Type main = shaken noTest $ \args obj -> do let src = "src/Test/C" want [obj "Main.exe"] obj "Main.exe" %> \out -> do cs <- getDirectoryFiles src ["*.c"] let os = map (obj . (<.> "o")) cs need os cmd "gcc -o" [out] os obj "*.c.o" %> \out -> do let c = src </> takeBaseName out need [c] headers <- cIncludes c need $ map ((</>) src . takeFileName) headers cmd "gcc -o" [out] "-c" [c] cIncludes :: FilePath -> Action [FilePath] cIncludes x = do Stdout stdout <- cmd "gcc" ["-MM",x] return $ drop 2 $ words stdout shake-0.15.5/src/Test/Benchmark.hs0000644000000000000000000000132212560222036015013 0ustar0000000000000000 module Test.Benchmark(main) where import Development.Shake import Test.Type import Data.List import Development.Shake.FilePath -- | Given a breadth and depth come up with a set of build files main = shaken (\_ _ -> return ()) $ \args obj -> do let get ty = head $ [read $ drop (length ty + 1) a | a <- args, (ty ++ "=") `isPrefixOf` a] ++ error ("Could not find argument, expected " ++ ty ++ "=Number") depth = get "depth" breadth = get "breadth" want [obj $ "0." 
++ show i | i <- [1..breadth]] obj "*" %> \out -> do let d = read $ takeBaseName out need [obj $ show (d + 1) ++ "." ++ show i | d < depth, i <- [1..breadth]] writeFile' out "" shake-0.15.5/src/Test/Basic.hs0000644000000000000000000001044512560222036014150 0ustar0000000000000000 module Test.Basic(main) where import Development.Shake import Test.Type import System.Directory as IO import Data.List import Data.Maybe import Control.Monad import General.Extra main = shaken test $ \args obj -> do want $ map (\x -> fromMaybe (obj x) $ stripPrefix "!" x) args obj "AB.txt" %> \out -> do need [obj "A.txt", obj "B.txt"] text1 <- readFile' $ obj "A.txt" text2 <- readFile' $ obj "B.txt" writeFile' out $ text1 ++ text2 obj "twice.txt" %> \out -> do let src = obj "once.txt" need [src, src] copyFile' src out obj "once.txt" %> \out -> do src <- readFile' $ obj "zero.txt" writeFile' out src phonys $ \x -> if x /= "halfclean" then Nothing else Just $ do removeFilesAfter (obj "") ["//*e.txt"] phony "cleaner" $ do removeFilesAfter (obj "") ["//*"] phony (obj "configure") $ do liftIO $ appendFile (obj "configure") "1" phony "install" $ do need [obj "configure",obj "once.txt"] liftIO $ appendFile (obj "install") "1" phony "dummy" $ do liftIO $ appendFile (obj "dummy") "1" phony "threads" $ do x <- getShakeOptions writeFile' (obj "threads.txt") $ show $ shakeThreads x obj "dummer.txt" %> \out -> do need ["dummy","dummy"] need ["dummy"] liftIO $ appendFile out "1" r <- newResource ".log file" 1 let trace x = withResource r 1 $ liftIO $ appendFile (obj ".log") x obj "*.par" %> \out -> do trace "[" (if "unsafe" `isInfixOf` out then unsafeExtraThread else id) $ liftIO $ sleep 0.1 trace "]" writeFile' out out test build obj = do writeFile (obj "A.txt") "AAA" writeFile (obj "B.txt") "BBB" build ["AB.txt","--sleep"] assertContents (obj "AB.txt") "AAABBB" appendFile (obj "A.txt") "aaa" build ["AB.txt"] assertContents (obj "AB.txt") "AAAaaaBBB" removeFile $ obj "AB.txt" build ["AB.txt"] 
assertContents (obj "AB.txt") "AAAaaaBBB" writeFile (obj "zero.txt") "xxx" build ["twice.txt","--sleep"] assertContents (obj "twice.txt") "xxx" writeFile (obj "zero.txt") "yyy" build ["once.txt","--sleep"] assertContents (obj "twice.txt") "xxx" assertContents (obj "once.txt") "yyy" writeFile (obj "zero.txt") "zzz" build ["once.txt","twice.txt","--sleep"] assertContents (obj "twice.txt") "zzz" assertContents (obj "once.txt") "zzz" removeFile $ obj "twice.txt" build ["twice.txt"] assertContents (obj "twice.txt") "zzz" show shakeOptions === show shakeOptions build ["!halfclean"] b <- IO.doesDirectoryExist (obj "") assert b "Directory should exist, cleaner should not have removed it" build ["!cleaner"] sleep 1 -- sometimes takes a while for the file system to notice b <- IO.doesDirectoryExist (obj "") assert (not b) "Directory should not exist, cleaner should have removed it" IO.createDirectory $ obj "" writeFile (obj "zero.txt") "" build ["configure"] build ["!install"] build ["!install"] assertContents (obj "configure") "111" assertContents (obj "install") "11" writeFile (obj "dummy.txt") "" build ["!dummy"] assertContents (obj "dummy") "1" build ["!dummy"] assertContents (obj "dummy") "11" build ["!dummy","!dummy"] assertContents (obj "dummy") "111" writeFile (obj "dummer.txt") "" build ["dummer.txt"] assertContents (obj "dummer.txt") "1" build ["dummer.txt"] assertContents (obj "dummer.txt") "11" build ["1.par","2.par","-j1"] assertContents (obj ".log") "[][]" writeFile (obj ".log") "" build ["3.par","4.par","-j2"] assertContents (obj ".log") "[[]]" writeFile (obj ".log") "" processors <- getProcessorCount putStrLn $ "getProcessorCount returned " ++ show processors when (processors > 1) $ do build ["5.par","6.par","-j0"] assertContents (obj ".log") "[[]]" writeFile (obj ".log") "" build ["unsafe1.par","unsafe2.par","-j2"] assertContents (obj ".log") "[[]]" build ["!threads","-j3"] assertContents (obj "threads.txt") "3" build ["!threads","-j0"] assertContents (obj 
"threads.txt") (show processors) build [] -- should say "no want/action statements, nothing to do" (checked manually) shake-0.15.5/src/Test/Assume.hs0000644000000000000000000000201712560222036014360 0ustar0000000000000000 module Test.Assume(main) where import Development.Shake import Test.Type import Control.Monad import Development.Shake.FilePath main = shaken test $ \args obj -> do want $ map obj args obj "*.out" %> \out -> do cs <- mapM (readFile' . obj . (:".src")) $ takeBaseName out writeFile' out $ concat cs test build obj = do let set file c = writeFile (obj $ file : ".src") [c] let ask file c = do src <- readFile (obj $ file ++ ".out"); src === c forM_ ['a'..'f'] $ \c -> set c c build ["--sleep","abc.out"] ask "abc" "abc" set 'b' 'd' build ["--sleep","abc.out"] ask "abc" "adc" set 'b' 'p' build ["--sleep","abc.out","--touch"] build ["abc.out"] ask "abc" "adc" set 'c' 'z' build ["--sleep","abc.out"] ask "abc" "apz" build ["bc.out","c.out"] ask "bc" "pz" set 'b' 'r' set 'c' 'n' build ["--sleep","abc.out","--touch"] ask "abc" "apz" build ["ab.out","--always-make"] ask "ab" "ar" build ["c.out"] ask "c" "z" shake-0.15.5/src/Test/Tup/0000755000000000000000000000000012560222036013337 5ustar0000000000000000shake-0.15.5/src/Test/Tup/root.cfg0000644000000000000000000000010612560222036015000 0ustar0000000000000000 hello.exe = hello.c newmath.a include src/Test/Tup/newmath/root.cfg shake-0.15.5/src/Test/Tup/hello.c0000644000000000000000000000023112560222036014602 0ustar0000000000000000#include <stdio.h> #include "square.h" int main(void) { printf("Hi, everybody!\n"); printf("Five squared is: %i\n", square(5)); return 0; } shake-0.15.5/src/Test/Tup/newmath/0000755000000000000000000000000012560222036015002 5ustar0000000000000000shake-0.15.5/src/Test/Tup/newmath/square.h0000644000000000000000000000002312560222036016446 0ustar0000000000000000int square(int x); shake-0.15.5/src/Test/Tup/newmath/square.c0000644000000000000000000000007512560222036016450 0ustar0000000000000000#include 
"square.h" int square(int x) { return x * x; } shake-0.15.5/src/Test/Tup/newmath/root.cfg0000644000000000000000000000002612560222036016444 0ustar0000000000000000 newmath.a = square.c shake-0.15.5/src/Test/Tar/0000755000000000000000000000000012560222036013315 5ustar0000000000000000shake-0.15.5/src/Test/Tar/list.txt0000644000000000000000000000006112560222036015026 0ustar0000000000000000src/Test/Tar.hs src/Run.hs src/Test/Tar/list.txt shake-0.15.5/src/Test/Progress/0000755000000000000000000000000012560222036014373 5ustar0000000000000000shake-0.15.5/src/Test/Progress/self-zero-j2.prog0000644000000000000000000003006412560222036017506 0ustar0000000000000000(1.020102,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 254, countUnknown = 4, countTodo = 148, timeSkipped = 0.0, timeBuilt = 0.7620761110447347, timeUnknown = 2.0004000980407e-3, timeTodo = (146.13152742385864,0)}) (3.7443745,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 258, countUnknown = 1, countTodo = 147, timeSkipped = 0.0, timeBuilt = 1.0521051175892353, timeUnknown = 0.0, timeTodo = (145.76045322418213,0)}) (5.032503,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 260, countUnknown = 1, countTodo = 145, timeSkipped = 0.0, timeBuilt = 3.678367782384157, timeUnknown = 0.0, timeTodo = (143.16193342208862,0)}) (6.339634,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 261, countUnknown = 1, countTodo = 144, timeSkipped = 0.0, timeBuilt = 6.317631412297487, timeUnknown = 0.0, timeTodo = (140.56067323684692,0)}) (7.6067605,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 262, countUnknown = 1, countTodo = 143, timeSkipped = 0.0, timeBuilt = 8.911890674382448, timeUnknown = 0.0, timeTodo = (137.99141645431519,0)}) (8.867887,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 265, countUnknown = 1, countTodo = 140, timeSkipped = 0.0, timeBuilt = 11.489148784428835, timeUnknown = 0.0, timeTodo = (135.31988191604614,0)}) 
(10.124012,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 266, countUnknown = 1, countTodo = 139, timeSkipped = 0.0, timeBuilt = 14.01840179041028, timeUnknown = 0.0, timeTodo = (132.74862432479858,0)}) (11.394139,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 270, countUnknown = 1, countTodo = 135, timeSkipped = 0.0, timeBuilt = 16.545654464513063, timeUnknown = 0.0, timeTodo = (130.13927960395813,0)}) (12.661266,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 278, countUnknown = 1, countTodo = 127, timeSkipped = 0.0, timeBuilt = 19.09290998056531, timeUnknown = 0.0, timeTodo = (127.51380348205566,0)}) (13.928392,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 280, countUnknown = 1, countTodo = 125, timeSkipped = 0.0, timeBuilt = 21.638164687901735, timeUnknown = 0.0, timeTodo = (124.94054412841797,0)}) (15.19352,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 283, countUnknown = 1, countTodo = 122, timeSkipped = 0.0, timeBuilt = 24.176418472081423, timeUnknown = 0.0, timeTodo = (122.32628202438354,0)}) (16.462646,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 284, countUnknown = 1, countTodo = 121, timeSkipped = 0.0, timeBuilt = 26.708671737462282, timeUnknown = 0.0, timeTodo = (119.71102046966553,0)}) (17.723772,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 286, countUnknown = 1, countTodo = 119, timeSkipped = 0.0, timeBuilt = 29.24492471292615, timeUnknown = 0.0, timeTodo = (117.13076210021973,0)}) (18.9989,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 290, countUnknown = 1, countTodo = 115, timeSkipped = 0.0, timeBuilt = 31.782178092747927, timeUnknown = 0.0, timeTodo = (114.32748413085938,0)}) (20.265026,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 294, countUnknown = 1, countTodo = 111, timeSkipped = 0.0, timeBuilt = 34.328432250767946, timeUnknown = 0.0, timeTodo = (110.95114421844482,0)}) (21.525152,Progress {isFailure = 
Nothing, countSkipped = 0, countBuilt = 296, countUnknown = 1, countTodo = 109, timeSkipped = 0.0, timeBuilt = 36.87168710306287, timeUnknown = 0.0, timeTodo = (108.3658857345581,0)}) (22.792278,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 297, countUnknown = 1, countTodo = 108, timeSkipped = 0.0, timeBuilt = 39.39593904092908, timeUnknown = 0.0, timeTodo = (105.79362773895264,0)}) (24.09941,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 301, countUnknown = 1, countTodo = 104, timeSkipped = 0.0, timeBuilt = 41.934190917760134, timeUnknown = 0.0, timeTodo = (103.46539211273193,0)}) (25.552555,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 307, countUnknown = 1, countTodo = 98, timeSkipped = 0.0, timeBuilt = 44.71846978738904, timeUnknown = 0.0, timeTodo = (100.06105041503906,0)}) (27.156715,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 310, countUnknown = 1, countTodo = 95, timeSkipped = 0.0, timeBuilt = 47.48474519327283, timeUnknown = 0.0, timeTodo = (97.22976875305176,0)}) (28.427843,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 311, countUnknown = 1, countTodo = 94, timeSkipped = 0.0, timeBuilt = 50.347032714635134, timeUnknown = 0.0, timeTodo = (94.5565013885498,0)}) (29.692968,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 312, countUnknown = 1, countTodo = 93, timeSkipped = 0.0, timeBuilt = 53.222318816930056, timeUnknown = 0.0, timeTodo = (91.93223762512207,0)}) (30.960096,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 314, countUnknown = 1, countTodo = 91, timeSkipped = 0.0, timeBuilt = 55.76357476785779, timeUnknown = 0.0, timeTodo = (89.2559642791748,0)}) (32.22322,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 317, countUnknown = 1, countTodo = 88, timeSkipped = 0.0, timeBuilt = 58.305828262120485, timeUnknown = 0.0, timeTodo = (86.59769439697266,0)}) (33.50235,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 321, 
countUnknown = 1, countTodo = 84, timeSkipped = 0.0, timeBuilt = 60.847080398350954, timeUnknown = 0.0, timeTodo = (83.98942756652832,0)}) (34.50545,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 321, countUnknown = 1, countTodo = 84, timeSkipped = 0.0, timeBuilt = 60.847080398350954, timeUnknown = 0.0, timeTodo = (83.98942756652832,0)}) (35.778576,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 327, countUnknown = 1, countTodo = 78, timeSkipped = 0.0, timeBuilt = 64.42744653299451, timeUnknown = 0.0, timeTodo = (79.1019344329834,0)}) (37.039703,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 331, countUnknown = 1, countTodo = 74, timeSkipped = 0.0, timeBuilt = 67.99280183389783, timeUnknown = 0.0, timeTodo = (75.79260063171387,0)}) (38.30183,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 332, countUnknown = 1, countTodo = 73, timeSkipped = 0.0, timeBuilt = 70.5080529935658, timeUnknown = 0.0, timeTodo = (73.16133689880371,0)}) (39.60196,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 337, countUnknown = 1, countTodo = 68, timeSkipped = 0.0, timeBuilt = 73.04030625894666, timeUnknown = 0.0, timeTodo = (70.54007148742676,0)}) (40.9971,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 341, countUnknown = 1, countTodo = 64, timeSkipped = 0.0, timeBuilt = 75.7465783841908, timeUnknown = 0.0, timeTodo = (67.84580421447754,0)}) (42.251225,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 343, countUnknown = 1, countTodo = 62, timeSkipped = 0.0, timeBuilt = 78.44684617593884, timeUnknown = 0.0, timeTodo = (64.56347465515137,0)}) (43.51035,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 348, countUnknown = 1, countTodo = 57, timeSkipped = 0.0, timeBuilt = 80.97609345987439, timeUnknown = 0.0, timeTodo = (61.85720443725586,0)}) (45.31053,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt 
= 85.3115312345326, timeUnknown = 0.0, timeTodo = (56.32666015625,0)}) (46.31663,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt = 85.3115312345326, timeUnknown = 0.0, timeTodo = (56.32666015625,0)}) (47.953796,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 1, countTodo = 48, timeSkipped = 0.0, timeBuilt = 87.96779649332166, timeUnknown = 0.0, timeTodo = (52.5462760925293,0)}) (49.22192,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 1, countTodo = 48, timeSkipped = 0.0, timeBuilt = 87.96779649332166, timeUnknown = 0.0, timeTodo = (52.5462760925293,0)}) (50.483047,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 358, countUnknown = 1, countTodo = 47, timeSkipped = 0.0, timeBuilt = 90.48504655435681, timeUnknown = 0.0, timeTodo = (49.943016052246094,0)}) (51.742172,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 362, countUnknown = 1, countTodo = 43, timeSkipped = 0.0, timeBuilt = 94.27242677286267, timeUnknown = 0.0, timeTodo = (46.054622650146484,0)}) (53.0143,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 364, countUnknown = 1, countTodo = 41, timeSkipped = 0.0, timeBuilt = 96.79567353799939, timeUnknown = 0.0, timeTodo = (43.455360412597656,0)}) (54.279427,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 365, countUnknown = 1, countTodo = 40, timeSkipped = 0.0, timeBuilt = 99.32792680338025, timeUnknown = 0.0, timeTodo = (40.88410568237305,0)}) (55.549553,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 368, countUnknown = 1, countTodo = 37, timeSkipped = 0.0, timeBuilt = 101.87118165567517, timeUnknown = 0.0, timeTodo = (38.3088493347168,0)}) (56.90669,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 371, countUnknown = 1, countTodo = 34, timeSkipped = 0.0, timeBuilt = 104.48744599893689, timeUnknown = 0.0, timeTodo = 
(35.06652069091797,0)}) (58.179817,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 372, countUnknown = 1, countTodo = 33, timeSkipped = 0.0, timeBuilt = 107.11970918253064, timeUnknown = 0.0, timeTodo = (32.46625900268555,0)}) (59.453945,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 380, countUnknown = 1, countTodo = 25, timeSkipped = 0.0, timeBuilt = 109.69997041299939, timeUnknown = 0.0, timeTodo = (29.24193572998047,0)}) (61.17312,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 114.0083963163197, timeUnknown = 0.0, timeTodo = (24.391448974609375,0)}) (62.17922,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 114.0083963163197, timeUnknown = 0.0, timeTodo = (24.391448974609375,0)}) (63.787376,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 390, countUnknown = 1, countTodo = 15, timeSkipped = 0.0, timeBuilt = 116.61765687540174, timeUnknown = 0.0, timeTodo = (21.717185974121094,0)}) (67.25372,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 393, countUnknown = 1, countTodo = 12, timeSkipped = 0.0, timeBuilt = 118.84489076212049, timeUnknown = 0.0, timeTodo = (19.470962524414063,0)}) (69.114914,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 399, countUnknown = 1, countTodo = 6, timeSkipped = 0.0, timeBuilt = 124.42945497110486, timeUnknown = 0.0, timeTodo = (13.024314880371094,0)}) (70.91309,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 402, countUnknown = 1, countTodo = 3, timeSkipped = 0.0, timeBuilt = 127.54076020792127, timeUnknown = 0.0, timeTodo = (10.791084289550781,0)}) (72.67527,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (73.68037,Progress 
{isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (74.683464,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (75.68657,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (76.68967,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) (77.69477,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 129.2949411161244, timeUnknown = 0.0, timeTodo = (8.603858947753906,0)}) shake-0.15.5/src/Test/Progress/self-rebuild-j2.prog0000644000000000000000000002753612560222036020167 0ustar0000000000000000(3.6053605,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 258, countUnknown = 1, countTodo = 147, timeSkipped = 0.0, timeBuilt = 1.054105345858261, timeUnknown = 0.0, timeTodo = (135.01951599121094,0)}) (4.8964896,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 259, countUnknown = 1, countTodo = 146, timeSkipped = 0.0, timeBuilt = 3.6373634978663176, timeUnknown = 0.0, timeTodo = (132.3802523612976,0)}) (6.189619,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 260, countUnknown = 1, countTodo = 145, timeSkipped = 0.0, timeBuilt = 6.2686262771021575, timeUnknown = 0.0, timeTodo = (129.85099935531616,0)}) (7.4677467,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 262, countUnknown = 1, countTodo = 143, timeSkipped = 0.0, timeBuilt = 8.854884450091049, 
timeUnknown = 0.0, timeTodo = (127.25373983383179,0)}) (8.742874,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 268, countUnknown = 1, countTodo = 137, timeSkipped = 0.0, timeBuilt = 11.444143597735092, timeUnknown = 0.0, timeTodo = (124.70048475265503,0)}) (10.016002,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 271, countUnknown = 1, countTodo = 134, timeSkipped = 0.0, timeBuilt = 16.5466540500056, timeUnknown = 0.0, timeTodo = (119.64597797393799,0)}) (11.30813,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 271, countUnknown = 1, countTodo = 134, timeSkipped = 0.0, timeBuilt = 16.5466540500056, timeUnknown = 0.0, timeTodo = (119.64597797393799,0)}) (12.59526,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 276, countUnknown = 1, countTodo = 129, timeSkipped = 0.0, timeBuilt = 19.123910252703354, timeUnknown = 0.0, timeTodo = (117.0657205581665,0)}) (13.876388,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 278, countUnknown = 1, countTodo = 127, timeSkipped = 0.0, timeBuilt = 21.708168332232162, timeUnknown = 0.0, timeTodo = (114.43845868110657,0)}) (15.173517,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 280, countUnknown = 1, countTodo = 125, timeSkipped = 0.0, timeBuilt = 24.27642471040599, timeUnknown = 0.0, timeTodo = (111.89620327949524,0)}) (16.456646,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 282, countUnknown = 1, countTodo = 123, timeSkipped = 0.0, timeBuilt = 26.85868198121898, timeUnknown = 0.0, timeTodo = (109.3659496307373,0)}) (17.726772,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 284, countUnknown = 1, countTodo = 121, timeSkipped = 0.0, timeBuilt = 29.442940060747787, timeUnknown = 0.0, timeTodo = (106.83069705963135,0)}) (19.027903,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 288, countUnknown = 1, countTodo = 117, timeSkipped = 0.0, timeBuilt = 32.00519592012279, timeUnknown = 0.0, timeTodo = 
(104.28944301605225,0)}) (20.345034,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 292, countUnknown = 1, countTodo = 113, timeSkipped = 0.0, timeBuilt = 34.586452786577865, timeUnknown = 0.0, timeTodo = (101.74619007110596,0)}) (21.648165,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 300, countUnknown = 1, countTodo = 105, timeSkipped = 0.0, timeBuilt = 37.32872611726634, timeUnknown = 0.0, timeTodo = (99.18993377685547,0)}) (23.717371,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 301, countUnknown = 1, countTodo = 104, timeSkipped = 0.0, timeBuilt = 40.75507003511302, timeUnknown = 0.0, timeTodo = (96.41965675354004,0)}) (25.022503,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 303, countUnknown = 1, countTodo = 102, timeSkipped = 0.0, timeBuilt = 44.12840873445384, timeUnknown = 0.0, timeTodo = (93.89340400695801,0)}) (26.289629,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 304, countUnknown = 1, countTodo = 101, timeSkipped = 0.0, timeBuilt = 46.6976664706599, timeUnknown = 0.0, timeTodo = (91.36415100097656,0)}) (27.576757,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 305, countUnknown = 1, countTodo = 100, timeSkipped = 0.0, timeBuilt = 49.27292472566478, timeUnknown = 0.0, timeTodo = (88.50186347961426,0)}) (29.49495,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 308, countUnknown = 1, countTodo = 97, timeSkipped = 0.0, timeBuilt = 52.47924644197337, timeUnknown = 0.0, timeTodo = (84.93850708007813,0)}) (31.11511,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 311, countUnknown = 1, countTodo = 94, timeSkipped = 0.0, timeBuilt = 56.04560119356029, timeUnknown = 0.0, timeTodo = (81.36915397644043,0)}) (32.42124,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 313, countUnknown = 1, countTodo = 92, timeSkipped = 0.0, timeBuilt = 58.938888852251694, timeUnknown = 0.0, timeTodo = (78.84289932250977,0)}) (33.68837,Progress 
{isFailure = Nothing, countSkipped = 0, countBuilt = 315, countUnknown = 1, countTodo = 90, timeSkipped = 0.0, timeBuilt = 61.51514465059154, timeUnknown = 0.0, timeTodo = (75.96561431884766,0)}) (34.961494,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 322, countUnknown = 1, countTodo = 83, timeSkipped = 0.0, timeBuilt = 64.10740501130931, timeUnknown = 0.0, timeTodo = (73.41636085510254,0)}) (36.245625,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 330, countUnknown = 1, countTodo = 75, timeSkipped = 0.0, timeBuilt = 66.67165786470287, timeUnknown = 0.0, timeTodo = (70.8551082611084,0)}) (37.545753,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 331, countUnknown = 1, countTodo = 74, timeSkipped = 0.0, timeBuilt = 69.22791320527904, timeUnknown = 0.0, timeTodo = (68.3188533782959,0)}) (38.944893,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 336, countUnknown = 1, countTodo = 69, timeSkipped = 0.0, timeBuilt = 71.91318160737865, timeUnknown = 0.0, timeTodo = (65.60857582092285,0)}) (40.345036,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 337, countUnknown = 1, countTodo = 68, timeSkipped = 0.0, timeBuilt = 74.72146255220287, timeUnknown = 0.0, timeTodo = (62.849300384521484,0)}) (41.651165,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 340, countUnknown = 1, countTodo = 65, timeSkipped = 0.0, timeBuilt = 77.43573409761302, timeUnknown = 0.0, timeTodo = (60.147029876708984,0)}) (42.9913,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 345, countUnknown = 1, countTodo = 60, timeSkipped = 0.0, timeBuilt = 80.11199981416576, timeUnknown = 0.0, timeTodo = (57.615779876708984,0)}) (44.28843,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 347, countUnknown = 1, countTodo = 58, timeSkipped = 0.0, timeBuilt = 82.76326781953685, timeUnknown = 0.0, timeTodo = (55.096527099609375,0)}) (45.844585,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 
350, countUnknown = 1, countTodo = 55, timeSkipped = 0.0, timeBuilt = 85.57555038179271, timeUnknown = 0.0, timeTodo = (53.285343170166016,0)}) (46.848686,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt = 88.44083816255443, timeUnknown = 0.0, timeTodo = (50.76008987426758,0)}) (47.852783,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 1, countTodo = 51, timeSkipped = 0.0, timeBuilt = 88.44083816255443, timeUnknown = 0.0, timeTodo = (50.76008987426758,0)}) (50.848083,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 1, countTodo = 48, timeSkipped = 0.0, timeBuilt = 92.16721374238841, timeUnknown = 0.0, timeTodo = (48.103824615478516,0)}) (52.852283,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 358, countUnknown = 1, countTodo = 47, timeSkipped = 0.0, timeBuilt = 95.5145495578181, timeUnknown = 0.0, timeTodo = (45.494564056396484,0)}) (54.130413,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 362, countUnknown = 1, countTodo = 43, timeSkipped = 0.0, timeBuilt = 100.13300735200755, timeUnknown = 0.0, timeTodo = (41.70818328857422,0)}) (55.41054,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 363, countUnknown = 1, countTodo = 42, timeSkipped = 0.0, timeBuilt = 102.67526275361888, timeUnknown = 0.0, timeTodo = (39.17192840576172,0)}) (56.414642,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 369, countUnknown = 1, countTodo = 36, timeSkipped = 0.0, timeBuilt = 105.25251800264232, timeUnknown = 0.0, timeTodo = (36.63368225097656,0)}) (59.073906,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 372, countUnknown = 1, countTodo = 33, timeSkipped = 0.0, timeBuilt = 107.67676574434154, timeUnknown = 0.0, timeTodo = (34.10843276977539,0)}) (60.356033,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 374, countUnknown = 1, countTodo = 31, timeSkipped 
= 0.0, timeBuilt = 110.21202117647044, timeUnknown = 0.0, timeTodo = (31.546173095703125,0)}) (61.641163,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 375, countUnknown = 1, countTodo = 30, timeSkipped = 0.0, timeBuilt = 114.02240401948802, timeUnknown = 0.0, timeTodo = (28.913909912109375,0)}) (62.92029,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 381, countUnknown = 1, countTodo = 24, timeSkipped = 0.0, timeBuilt = 116.60766250337474, timeUnknown = 0.0, timeTodo = (26.366653442382813,0)}) (65.09751,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 121.40914184297435, timeUnknown = 0.0, timeTodo = (22.06322479248047,0)}) (66.10161,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 1, countTodo = 18, timeSkipped = 0.0, timeBuilt = 121.40914184297435, timeUnknown = 0.0, timeTodo = (22.06322479248047,0)}) (67.678764,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 390, countUnknown = 1, countTodo = 15, timeSkipped = 0.0, timeBuilt = 123.98239928926341, timeUnknown = 0.0, timeTodo = (19.453964233398438,0)}) (71.20812,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 393, countUnknown = 1, countTodo = 12, timeSkipped = 0.0, timeBuilt = 126.19761687959544, timeUnknown = 0.0, timeTodo = (17.226730346679688,0)}) (72.21122,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 396, countUnknown = 1, countTodo = 9, timeSkipped = 0.0, timeBuilt = 128.785888974322, timeUnknown = 0.0, timeTodo = (14.163421630859375,0)}) (73.53935,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 399, countUnknown = 1, countTodo = 6, timeSkipped = 0.0, timeBuilt = 133.6913836642634, timeUnknown = 0.0, timeTodo = (11.642166137695313,0)}) (75.757576,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 402, countUnknown = 1, countTodo = 3, timeSkipped = 0.0, timeBuilt = 135.9266131564509, timeUnknown = 
0.0, timeTodo = (8.530860900878906,0)}) (77.91779,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (78.92289,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (79.925995,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (80.93109,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (81.93419,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) (82.93929,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 1, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.08883697236888, timeUnknown = 0.0, timeTodo = (6.776679992675781,0)}) shake-0.15.5/src/Test/Progress/self-clean-j2.prog0000644000000000000000000003047412560222036017616 0ustar0000000000000000(1.120224,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 254, countUnknown = 0, countTodo = 145, timeSkipped = 0.0, timeBuilt = 0.8861771221272647, timeUnknown = 0.0, timeTodo = (0.0,145)}) (2.4994998,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 258, countUnknown = 0, countTodo = 147, timeSkipped = 0.0, timeBuilt = 1.259251721901819, timeUnknown = 0.0, timeTodo = (0.0,147)}) (5.156031,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 259, countUnknown = 0, countTodo = 146, timeSkipped = 0.0, 
timeBuilt = 3.8557707152795047, timeUnknown = 0.0, timeTodo = (0.0,146)}) (6.4472423,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 261, countUnknown = 0, countTodo = 144, timeSkipped = 0.0, timeBuilt = 6.519303449196741, timeUnknown = 0.0, timeTodo = (0.0,144)}) (7.737371,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 262, countUnknown = 0, countTodo = 143, timeSkipped = 0.0, timeBuilt = 9.126777537865564, timeUnknown = 0.0, timeTodo = (0.0,143)}) (9.061503,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 267, countUnknown = 0, countTodo = 138, timeSkipped = 0.0, timeBuilt = 11.739121564431116, timeUnknown = 0.0, timeTodo = (0.0,138)}) (10.368634,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 268, countUnknown = 0, countTodo = 137, timeSkipped = 0.0, timeBuilt = 14.354383119149134, timeUnknown = 0.0, timeTodo = (0.0,137)}) (11.658763,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 272, countUnknown = 0, countTodo = 133, timeSkipped = 0.0, timeBuilt = 16.976646550698206, timeUnknown = 0.0, timeTodo = (0.0,133)}) (12.941892,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 274, countUnknown = 0, countTodo = 131, timeSkipped = 0.0, timeBuilt = 19.57990659098141, timeUnknown = 0.0, timeTodo = (0.0,131)}) (14.22602,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 279, countUnknown = 0, countTodo = 126, timeSkipped = 0.0, timeBuilt = 22.17116750101559, timeUnknown = 0.0, timeTodo = (0.0,126)}) (15.511148,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 282, countUnknown = 0, countTodo = 123, timeSkipped = 0.0, timeBuilt = 24.74242699961178, timeUnknown = 0.0, timeTodo = (0.0,123)}) (16.794277,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 283, countUnknown = 0, countTodo = 122, timeSkipped = 0.0, timeBuilt = 27.31368459085934, timeUnknown = 0.0, timeTodo = (0.0,122)}) (18.085405,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 288, 
countUnknown = 0, countTodo = 117, timeSkipped = 0.0, timeBuilt = 29.8999453864526, timeUnknown = 0.0, timeTodo = (0.0,117)}) (19.398537,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 294, countUnknown = 0, countTodo = 111, timeSkipped = 0.0, timeBuilt = 32.487208493752405, timeUnknown = 0.0, timeTodo = (0.0,111)}) (21.451742,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 295, countUnknown = 0, countTodo = 110, timeSkipped = 0.0, timeBuilt = 35.872547277016565, timeUnknown = 0.0, timeTodo = (0.0,110)}) (22.761873,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 297, countUnknown = 0, countTodo = 108, timeSkipped = 0.0, timeBuilt = 39.24288571695797, timeUnknown = 0.0, timeTodo = (0.0,108)}) (24.060003,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 298, countUnknown = 0, countTodo = 107, timeSkipped = 0.0, timeBuilt = 41.825143941445276, timeUnknown = 0.0, timeTodo = (0.0,107)}) (25.555153,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 304, countUnknown = 0, countTodo = 101, timeSkipped = 0.0, timeBuilt = 44.66642964701168, timeUnknown = 0.0, timeTodo = (0.0,101)}) (26.868284,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 305, countUnknown = 0, countTodo = 100, timeSkipped = 0.0, timeBuilt = 47.460708745522425, timeUnknown = 0.0, timeTodo = (0.0,100)}) (28.22642,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 306, countUnknown = 0, countTodo = 99, timeSkipped = 0.0, timeBuilt = 50.111974843544886, timeUnknown = 0.0, timeTodo = (0.0,99)}) (29.52355,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 308, countUnknown = 0, countTodo = 97, timeSkipped = 0.0, timeBuilt = 52.788244374794886, timeUnknown = 0.0, timeTodo = (0.0,97)}) (30.82868,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 309, countUnknown = 0, countTodo = 96, timeSkipped = 0.0, timeBuilt = 55.38850606302731, timeUnknown = 0.0, timeTodo = (0.0,96)}) (31.83378,Progress {isFailure = 
Nothing, countSkipped = 0, countBuilt = 309, countUnknown = 0, countTodo = 96, timeSkipped = 0.0, timeBuilt = 55.38850606302731, timeUnknown = 0.0, timeTodo = (0.0,96)}) (33.099907,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 311, countUnknown = 0, countTodo = 94, timeSkipped = 0.0, timeBuilt = 57.70873845438473, timeUnknown = 0.0, timeTodo = (0.0,94)}) (34.455044,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 312, countUnknown = 0, countTodo = 93, timeSkipped = 0.0, timeBuilt = 62.583227284951136, timeUnknown = 0.0, timeTodo = (0.0,93)}) (35.775173,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 323, countUnknown = 0, countTodo = 82, timeSkipped = 0.0, timeBuilt = 65.22149098734371, timeUnknown = 0.0, timeTodo = (0.0,82)}) (37.087307,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 324, countUnknown = 0, countTodo = 81, timeSkipped = 0.0, timeBuilt = 67.89576161722653, timeUnknown = 0.0, timeTodo = (0.0,81)}) (38.39644,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 332, countUnknown = 0, countTodo = 73, timeSkipped = 0.0, timeBuilt = 70.54603017191403, timeUnknown = 0.0, timeTodo = (0.0,73)}) (39.705566,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 336, countUnknown = 0, countTodo = 69, timeSkipped = 0.0, timeBuilt = 73.18129933695309, timeUnknown = 0.0, timeTodo = (0.0,69)}) (41.676765,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 339, countUnknown = 0, countTodo = 66, timeSkipped = 0.0, timeBuilt = 76.48663152079098, timeUnknown = 0.0, timeTodo = (0.0,66)}) (42.969894,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 342, countUnknown = 0, countTodo = 63, timeSkipped = 0.0, timeBuilt = 79.77195943216793, timeUnknown = 0.0, timeTodo = (0.0,63)}) (44.364033,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 345, countUnknown = 0, countTodo = 60, timeSkipped = 0.0, timeBuilt = 82.46222890238278, timeUnknown = 0.0, timeTodo = (0.0,60)}) 
(45.367134,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 348, countUnknown = 0, countTodo = 57, timeSkipped = 0.0, timeBuilt = 85.15849507669918, timeUnknown = 0.0, timeTodo = (0.0,57)}) (47.0483,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 351, countUnknown = 0, countTodo = 54, timeSkipped = 0.0, timeBuilt = 87.8587666831445, timeUnknown = 0.0, timeTodo = (0.0,54)}) (48.052402,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 351, countUnknown = 0, countTodo = 54, timeSkipped = 0.0, timeBuilt = 87.8587666831445, timeUnknown = 0.0, timeTodo = (0.0,54)}) (49.890587,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 0, countTodo = 51, timeSkipped = 0.0, timeBuilt = 90.69304478983395, timeUnknown = 0.0, timeTodo = (0.0,51)}) (50.894688,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 0, countTodo = 51, timeSkipped = 0.0, timeBuilt = 90.69304478983395, timeUnknown = 0.0, timeTodo = (0.0,51)}) (51.897785,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 354, countUnknown = 0, countTodo = 51, timeSkipped = 0.0, timeBuilt = 90.69304478983395, timeUnknown = 0.0, timeTodo = (0.0,51)}) (54.92909,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 357, countUnknown = 0, countTodo = 48, timeSkipped = 0.0, timeBuilt = 94.47342885355465, timeUnknown = 0.0, timeTodo = (0.0,48)}) (56.23222,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 358, countUnknown = 0, countTodo = 47, timeSkipped = 0.0, timeBuilt = 97.04368794779293, timeUnknown = 0.0, timeTodo = (0.0,47)}) (57.53035,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 360, countUnknown = 0, countTodo = 45, timeSkipped = 0.0, timeBuilt = 100.92107594828121, timeUnknown = 0.0, timeTodo = (0.0,45)}) (58.82748,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 361, countUnknown = 0, countTodo = 44, timeSkipped = 0.0, timeBuilt = 103.52433598856442, timeUnknown = 
0.0, timeTodo = (0.0,44)}) (60.104607,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 368, countUnknown = 0, countTodo = 37, timeSkipped = 0.0, timeBuilt = 106.14560521463864, timeUnknown = 0.0, timeTodo = (0.0,37)}) (62.0268,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 369, countUnknown = 0, countTodo = 36, timeSkipped = 0.0, timeBuilt = 109.38093007425778, timeUnknown = 0.0, timeTodo = (0.0,36)}) (63.33293,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 372, countUnknown = 0, countTodo = 33, timeSkipped = 0.0, timeBuilt = 112.58624852518551, timeUnknown = 0.0, timeTodo = (0.0,33)}) (64.62606,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 374, countUnknown = 0, countTodo = 31, timeSkipped = 0.0, timeBuilt = 115.16150487284176, timeUnknown = 0.0, timeTodo = (0.0,31)}) (65.925186,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 379, countUnknown = 0, countTodo = 26, timeSkipped = 0.0, timeBuilt = 117.77476704935543, timeUnknown = 0.0, timeTodo = (0.0,26)}) (68.11641,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 0, countTodo = 18, timeSkipped = 0.0, timeBuilt = 122.62825597147457, timeUnknown = 0.0, timeTodo = (0.0,18)}) (69.121506,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 387, countUnknown = 0, countTodo = 18, timeSkipped = 0.0, timeBuilt = 122.62825597147457, timeUnknown = 0.0, timeTodo = (0.0,18)}) (70.77867,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 390, countUnknown = 0, countTodo = 15, timeSkipped = 0.0, timeBuilt = 125.30251897196285, timeUnknown = 0.0, timeTodo = (0.0,15)}) (74.353035,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 393, countUnknown = 0, countTodo = 12, timeSkipped = 0.0, timeBuilt = 127.54874242166989, timeUnknown = 0.0, timeTodo = (0.0,12)}) (75.35613,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 396, countUnknown = 0, countTodo = 9, timeSkipped = 0.0, timeBuilt = 
130.15401280741207, timeUnknown = 0.0, timeTodo = (0.0,9)}) (76.85028,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 399, countUnknown = 0, countTodo = 6, timeSkipped = 0.0, timeBuilt = 133.99539006571285, timeUnknown = 0.0, timeTodo = (0.0,6)}) (79.081505,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 402, countUnknown = 0, countTodo = 3, timeSkipped = 0.0, timeBuilt = 136.22862065653317, timeUnknown = 0.0, timeTodo = (0.0,3)}) (81.25272,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (82.260826,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (83.26492,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (84.27003,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (85.274124,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (86.28023,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (87.284325,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) (88.28843,Progress {isFailure = Nothing, countSkipped = 0, countBuilt = 404, countUnknown = 0, countTodo = 1, timeSkipped = 
0.0, timeBuilt = 138.41584599833004, timeUnknown = 0.0, timeTodo = (0.0,1)}) shake-0.15.5/src/Test/Progress/progress-nan.prog0000644000000000000000000000137412560222036017707 0ustar0000000000000000(5.215521335601807,Progress {isFailure = Nothing, countSkipped = 5028, countBuilt = 0, countUnknown = 36224, countTodo = 1837, timeSkipped = 274149.05088049243, timeBuilt = 0.0, timeUnknown = 31144.710513075814, timeTodo = (4301.8477063977625,1)}) (12.997299194335938,Progress {isFailure = Nothing, countSkipped = 12154, countBuilt = 0, countUnknown = 23682, countTodo = 7253, timeSkipped = 274894.82577174786, timeBuilt = 0.0, timeUnknown = 23933.863898285897, timeTodo = (10766.919429932255,1)}) (18.471847534179688,Progress {isFailure = Nothing, countSkipped = 18273, countBuilt = 27, countUnknown = 13862, countTodo = 10931, timeSkipped = 275406.36595057615, timeBuilt = 1.7951794527471066, timeUnknown = 17710.551982904668, timeTodo = (16478.41014388157,4)}) shake-0.15.5/src/Test/Ninja/0000755000000000000000000000000012560222036013626 5ustar0000000000000000shake-0.15.5/src/Test/Ninja/test6.ninja0000644000000000000000000000007112560222036015712 0ustar0000000000000000v2 = g1 include ${v1}-inc.ninja subninja ${v1}-sub.ninja shake-0.15.5/src/Test/Ninja/test6-sub.ninja0000644000000000000000000000001012560222036016472 0ustar0000000000000000v2 = g3 shake-0.15.5/src/Test/Ninja/test6-inc.ninja0000644000000000000000000000001012560222036016452 0ustar0000000000000000v2 = g2 shake-0.15.5/src/Test/Ninja/test5.ninja0000644000000000000000000000007412560222036015714 0ustar0000000000000000 rule run command = touch $out build output$ file: run shake-0.15.5/src/Test/Ninja/test4.ninja0000644000000000000000000000017212560222036015712 0ustar0000000000000000 rule run command = touch $out build ./out.txt: run build dir/../out2.txt: run build out: phony ./out.txt out2.txt shake-0.15.5/src/Test/Ninja/test3.ninja0000644000000000000000000000034512560222036015713 0ustar0000000000000000 v1 = g1 v5 = g1 rule 
dump command = echo $v1+$v2+$v3+$v4+$v5 > $out v1 = g2 build out3.1: dump v2 = b1 v1 = g3 subninja test3-sub.ninja include test3-inc.ninja build out3.2: dump v1 = g4 include subdir/1.ninja shake-0.15.5/src/Test/Ninja/test3-sub.ninja0000644000000000000000000000005512560222036016500 0ustar0000000000000000v4 = s1 v5 = s1 build out3.4: dump v5 = s2 shake-0.15.5/src/Test/Ninja/test3-inc.ninja0000644000000000000000000000003412560222036016455 0ustar0000000000000000v5 = i1 build out3.3: dump shake-0.15.5/src/Test/Ninja/test2.ninja0000644000000000000000000000013012560222036015702 0ustar0000000000000000 rule run command = touch $out build out2.1: run build out2.2: run default out2.2 shake-0.15.5/src/Test/Ninja/test1.ninja0000644000000000000000000000007012560222036015704 0ustar0000000000000000 rule run command = touch $out build out1.txt: run shake-0.15.5/src/Test/Ninja/redefine.ninja0000644000000000000000000000031512560222036016427 0ustar0000000000000000 vGlobal = version1 rule record # should produce version3 version2 command = echo $vGlobal $vBuild > $out vGlobal = version2 build redefine.txt: record vBuild = $vGlobal vGlobal = version3 shake-0.15.5/src/Test/Ninja/phonyorder.ninja0000644000000000000000000000025412560222036017041 0ustar0000000000000000 rule create command = touch $out rule copy command = cp $from $out build Foo2: phony || foo.txt build bar.txt: copy || Foo2 from = foo.txt build foo.txt: create shake-0.15.5/src/Test/Ninja/outputtouch.ninja0000644000000000000000000000011412560222036017246 0ustar0000000000000000 rule record command = echo hello > $out build outputtouch.txt: record shake-0.15.5/src/Test/Ninja/nocreate.ninja0000644000000000000000000000012412560222036016444 0ustar0000000000000000 rule gen command = echo x >> nocreate.log build nocreate.out: gen nocreate.in shake-0.15.5/src/Test/Ninja/lint.ninja0000644000000000000000000000036012560222036015614 0ustar0000000000000000 rule gen command = echo $out > $out rule run command = (echo $out : $out.gen > 
$out.d) && (echo $out > $out) depfile = $out.d build good: run || good.gen bad.gen build good.gen: gen build bad: run | input build bad.gen: gen shake-0.15.5/src/Test/Ninja/compdb.output0000644000000000000000000000204212560222036016352 0ustar0000000000000000[ { "directory": "*", "command": "g++ -MMD -MT build\\build.o -MF build\\build.o.d -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti -fno-exceptions -pipe -Wno-missing-field-initializers -DNINJA_PYTHON=\"python.exe\" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 -c src\\build.cc -o build\\build.o", "file": "src\\build.cc" }, { "directory": "*", "command": "g++ -MMD -MT build\\build_log.o -MF build\\build_log.o.d -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti -fno-exceptions -pipe -Wno-missing-field-initializers -DNINJA_PYTHON=\"python.exe\" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 -c src\\build_log.cc -o build\\build_log.o", "file": "src\\build_log.cc" }, { "directory": "*", "command": "g++ -MMD -MT build\\clean.o -MF build\\clean.o.d -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti -fno-exceptions -pipe -Wno-missing-field-initializers -DNINJA_PYTHON=\"python.exe\" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 -c src\\clean.cc -o build\\clean.o", "file": "src\\clean.cc" } ] shake-0.15.5/src/Test/Ninja/compdb.ninja0000644000000000000000000000244312560222036016116 0ustar0000000000000000# Copied from the Ninja repo ninja_required_version = 1.3 builddir = build cxx = g++ ar = ar cflags = -g -Wall -Wextra -Wno-deprecated -Wno-unused-parameter -fno-rtti $ -fno-exceptions -pipe -Wno-missing-field-initializers $ -DNINJA_PYTHON="python.exe" -O2 -DNDEBUG -D_WIN32_WINNT=0x0501 ldflags = -L$builddir -static rule cxx command = $cxx -MMD -MT $out -MF $out.d $cflags -c $in -o $out description = CXX $out depfile = $out.d deps = gcc rule ar command = cmd /c $ar cqs $out.tmp $in && move /Y $out.tmp $out description = AR $out rule link command = $cxx $ldflags -o $out $in $libs description = LINK $out # the 
depfile parser and ninja lexers are generated using re2c. # Core source files all build into ninja library. build $builddir\build.o: cxx src\build.cc build $builddir\build_log.o: cxx src\build_log.cc build $builddir\clean.o: cxx src\clean.cc rule doxygen command = doxygen $in description = DOXYGEN $in doxygen_mainpage_generator = src\gen_doxygen_mainpage.sh rule doxygen_mainpage command = $doxygen_mainpage_generator $in > $out description = DOXYGEN_MAINPAGE $out build $builddir\doxygen_mainpage: doxygen_mainpage README COPYING | $ $doxygen_mainpage_generator build doxygen: doxygen doc\doxygen.config | $builddir\doxygen_mainpage default ninja.exe shake-0.15.5/src/Test/Ninja/buildseparate.ninja0000644000000000000000000000026712560222036017500 0ustar0000000000000000 rule record # produces XX command = hello command = echo $vBar $command > $out build buildseparate.txt: record vFoo = foo vBar = bar vBar = X${vFoo}${vBar}X shake-0.15.5/src/Test/Ninja/subdir/0000755000000000000000000000000012560222036015116 5ustar0000000000000000shake-0.15.5/src/Test/Ninja/subdir/2.ninja0000644000000000000000000000000012560222036016266 0ustar0000000000000000shake-0.15.5/src/Test/Ninja/subdir/1.ninja0000644000000000000000000000012712560222036016277 0ustar0000000000000000# weirdly, Ninja includes are not relative to who includes them include subdir/2.ninja shake-0.15.5/src/Test/MakeTutor/0000755000000000000000000000000012560222036014502 5ustar0000000000000000shake-0.15.5/src/Test/MakeTutor/Makefile0000644000000000000000000000045112560222036016142 0ustar0000000000000000# From http://www.cs.colby.edu/maxwell/courses/tutorials/maketutor/, Makefile 4 CC=gcc CFLAGS=-I. 
DEPS = hellomake.h OBJ = hellomake.o hellofunc.o hellomake$(EXE): $(OBJ) $(CC) -o $@ $^ $(CFLAGS) %.o: %.c $(DEPS) $(CC) -c -o $@ $< $(CFLAGS) .PHONY: clean clean: rm hellomake$(EXE) rm *.o shake-0.15.5/src/Test/MakeTutor/hellomake.h0000644000000000000000000000007112560222036016612 0ustar0000000000000000/* example include file */ void myPrintHelloMake(void); shake-0.15.5/src/Test/MakeTutor/hellomake.c0000644000000000000000000000016012560222036016604 0ustar0000000000000000#include int main() { // call a function in another file myPrintHelloMake(); return(0); } shake-0.15.5/src/Test/MakeTutor/hellofunc.c0000644000000000000000000000016712560222036016631 0ustar0000000000000000#include #include void myPrintHelloMake(void) { printf("Hello makefiles!\n"); return; } shake-0.15.5/src/Test/C/0000755000000000000000000000000012560222036012751 5ustar0000000000000000shake-0.15.5/src/Test/C/main.c0000644000000000000000000000014112560222036014035 0ustar0000000000000000#include #include "constants.h" int main() { printf("%s\n", message()); return 0; } shake-0.15.5/src/Test/C/constants.h0000644000000000000000000000002112560222036015127 0ustar0000000000000000char* message(); shake-0.15.5/src/Test/C/constants.c0000644000000000000000000000010612560222036015126 0ustar0000000000000000 char msg[] = "Hello Shake Users!"; char* message() { return msg; } shake-0.15.5/src/General/0000755000000000000000000000000012560222036013225 5ustar0000000000000000shake-0.15.5/src/General/Timing.hs0000644000000000000000000000330612560222036015012 0ustar0000000000000000 module General.Timing(resetTimings, addTiming, printTimings) where import Data.IORef import Data.Time import System.IO.Unsafe import Numeric.Extra import System.Time.Extra {-# NOINLINE timings #-} timings :: IORef [(UTCTime, String)] -- number of times called, newest first timings = unsafePerformIO $ newIORef [] resetTimings :: IO () resetTimings = do now <- getCurrentTime writeIORef timings [(now, "Start")] -- | Print all withTiming 
information and clear the information. printTimings :: IO () printTimings = do now <- getCurrentTime old <- atomicModifyIORef timings $ \ts -> ([(now, "Start")], ts) putStr $ unlines $ showTimings now $ reverse old addTiming :: String -> IO () addTiming msg = do now <- getCurrentTime atomicModifyIORef timings $ \ts -> ((now,msg):ts, ()) showTimings :: UTCTime -> [(UTCTime, String)] -> [String] showTimings _ [] = [] showTimings stop times = showGap $ [(a ++ " ", showDP 3 b ++ "s " ++ showPerc b ++ " " ++ progress b) | (a,b) <- xs] ++ [("Total", showDP 3 sm ++ "s " ++ showPerc sm ++ " " ++ replicate 25 ' ')] where a // b = if b == 0 then 0 else a / b showPerc x = let s = show $ floor $ x * 100 // sm in replicate (3 - length s) ' ' ++ s ++ "%" progress x = let i = floor $ x * 25 // mx in replicate i '=' ++ replicate (25-i) ' ' mx = maximum $ map snd xs sm = sum $ map snd xs xs = [ (name, stop `subtractTime` start) | ((start, name), stop) <- zip times $ map fst (drop 1 times) ++ [stop]] showGap :: [(String,String)] -> [String] showGap xs = [a ++ replicate (n - length a - length b) ' ' ++ b | (a,b) <- xs] where n = maximum [length a + length b | (a,b) <- xs] shake-0.15.5/src/General/Template.hs0000644000000000000000000000355012560222036015337 0ustar0000000000000000{-# LANGUAGE ViewPatterns #-} module General.Template(runTemplate) where import System.FilePath.Posix import Control.Monad import Control.Monad.IO.Class import Data.Char import qualified Data.ByteString.Lazy.Char8 as LBS import qualified Language.Javascript.Flot as Flot import qualified Language.Javascript.JQuery as JQuery libraries = [("jquery.js", JQuery.file) ,("jquery.flot.js", Flot.file Flot.Flot) ,("jquery.flot.stack.js", Flot.file Flot.FlotStack) ] -- | Template Engine. Perform the following replacements on a line basis: -- -- * <script src="foo"></script> ==> <script> {contents of foo} </script> -- -- * <link href="foo" rel="stylesheet" type="text/css" /> ==> <style type="text/css"> {contents of foo} </style> runTemplate :: MonadIO m => (FilePath -> m LBS.ByteString) -> LBS.ByteString -> m LBS.ByteString runTemplate ask = liftM LBS.unlines . mapM f .
LBS.lines where link = LBS.pack "<link href=\"" script = LBS.pack "<script src=\"" f x | Just file <- lbs_stripPrefix script y = do res <- grab file; return $ LBS.pack "<script>\n" `LBS.append` res `LBS.append` LBS.pack "\n</script>" | Just file <- lbs_stripPrefix link y = do res <- grab file; return $ LBS.pack "<style type=\"text/css\">\n" `LBS.append` res `LBS.append` LBS.pack "\n</style>" | otherwise = return x where y = LBS.dropWhile isSpace x grab = asker . takeWhile (/= '\"') . LBS.unpack asker o@(splitFileName -> ("lib/",x)) = case lookup x libraries of Just act -> liftIO $ LBS.readFile =<< act Nothing -> error $ "Template library, unknown library: " ++ o asker x = ask x lbs_stripPrefix :: LBS.ByteString -> LBS.ByteString -> Maybe LBS.ByteString lbs_stripPrefix prefix text = if a == prefix then Just b else Nothing where (a,b) = LBS.splitAt (LBS.length prefix) text shake-0.15.5/src/General/String.hs0000644000000000000000000000272112560222036015031 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving #-} module General.String( BS, pack, unpack, pack_, unpack_, BSU, packU, unpackU, packU_, unpackU_, requireU ) where import qualified Data.ByteString as BS (any) import qualified Data.ByteString.Char8 as BS hiding (any) import qualified Data.ByteString.UTF8 as UTF8 import Development.Shake.Classes --------------------------------------------------------------------- -- Data.ByteString -- Mostly because ByteString does not have an NFData instance in older GHC -- | ASCII ByteString newtype BS = BS BS.ByteString deriving (Hashable, Binary, Eq) instance Show BS where show (BS x) = show x instance NFData BS where -- some versions of ByteString do not have NFData instances, but seq is equivalent -- for a strict bytestring. Therefore, we write our own instance. rnf (BS x) = x `seq` () -- | UTF8 ByteString newtype BSU = BSU BS.ByteString deriving (Hashable, Binary, Eq) instance NFData BSU where rnf (BSU x) = x `seq` () pack :: String -> BS pack = pack_ . BS.pack unpack :: BS -> String unpack = BS.unpack . unpack_ pack_ :: BS.ByteString -> BS pack_ = BS unpack_ :: BS -> BS.ByteString unpack_ (BS x) = x packU :: String -> BSU packU = packU_ .
UTF8.fromString unpackU :: BSU -> String unpackU = UTF8.toString . unpackU_ unpackU_ :: BSU -> BS.ByteString unpackU_ (BSU x) = x packU_ :: BS.ByteString -> BSU packU_ = BSU requireU :: BSU -> Bool requireU = BS.any (>= 0x80) . unpackU_ shake-0.15.5/src/General/Process.hs0000644000000000000000000002002712560222036015200 0ustar0000000000000000{-# LANGUAGE RecordWildCards #-} -- | A wrapping of createProcess to provide a more flexible interface. module General.Process( Buffer, newBuffer, readBuffer, process, ProcessOpts(..), Destination(..) ) where import Control.Applicative import Control.Concurrent import Control.DeepSeq import Control.Exception.Extra as C import Control.Monad.Extra import Data.List.Extra import Data.Maybe import Foreign.C.Error import System.Exit import System.IO.Extra import System.Info.Extra import System.Process import System.Time.Extra import Data.Unique import Data.IORef import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Internal as BS(createAndTrim) import qualified Data.ByteString.Lazy as LBS import General.Extra import Prelude import GHC.IO.Exception (IOErrorType(..), IOException(..)) --------------------------------------------------------------------- -- BUFFER ABSTRACTION data Buffer a = Buffer Unique (IORef [a]) instance Eq (Buffer a) where Buffer x _ == Buffer y _ = x == y instance Ord (Buffer a) where compare (Buffer x _) (Buffer y _) = compare x y newBuffer :: IO (Buffer a) newBuffer = liftM2 Buffer newUnique (newIORef []) addBuffer :: Buffer a -> a -> IO () addBuffer (Buffer _ ref) x = atomicModifyIORef ref $ \xs -> (x:xs, ()) readBuffer :: Buffer a -> IO [a] readBuffer (Buffer _ ref) = reverse <$> readIORef ref --------------------------------------------------------------------- -- OPTIONS data Destination = DestEcho | DestFile FilePath | DestString (Buffer String) | DestBytes (Buffer BS.ByteString) deriving (Eq,Ord) isDestString DestString{} = True; isDestString _ = False isDestBytes DestBytes{} = 
True; isDestBytes _ = False data ProcessOpts = ProcessOpts {poCommand :: CmdSpec ,poCwd :: Maybe FilePath ,poEnv :: Maybe [(String, String)] ,poTimeout :: Maybe Double ,poStdin :: Either String LBS.ByteString ,poStdout :: [Destination] ,poStderr :: [Destination] ,poAsync :: Bool } --------------------------------------------------------------------- -- IMPLEMENTATION -- | If two buffers can be replaced by one and a copy, do that (only if they start empty) optimiseBuffers :: ProcessOpts -> IO (ProcessOpts, IO ()) optimiseBuffers po@ProcessOpts{..} = return (po{poStdout = nubOrd poStdout, poStderr = nubOrd poStderr}, return ()) stdStream :: (FilePath -> Handle) -> [Destination] -> [Destination] -> StdStream stdStream file [DestEcho] other = Inherit stdStream file [DestFile x] other | other == [DestFile x] || DestFile x `notElem` other = UseHandle $ file x stdStream file _ _ = CreatePipe stdIn :: Either String LBS.ByteString -> (StdStream, Handle -> IO ()) stdIn inp | either null LBS.null inp = (Inherit, const $ return ()) | otherwise = (,) CreatePipe $ \h -> void $ tryBool isPipeGone $ do either (hPutStr h) (LBS.hPutStr h) inp hFlush h hClose h where isPipeGone IOError{ioe_type=ResourceVanished, ioe_errno=Just ioe} = Errno ioe == ePIPE isPipeGone _ = False withTimeout :: Maybe Double -> IO () -> IO a -> IO a withTimeout Nothing stop go = go withTimeout (Just s) stop go = bracket (forkIO $ sleep s >> stop) killThread $ const go cmdSpec :: CmdSpec -> CreateProcess cmdSpec (ShellCommand x) = shell x cmdSpec (RawCommand x xs) = proc x xs forkWait :: IO a -> IO (IO a) forkWait a = do res <- newEmptyMVar _ <- mask $ \restore -> forkIO $ try_ (restore a) >>= putMVar res return $ takeMVar res >>= either throwIO return abort :: ProcessHandle -> IO () abort pid = do interruptProcessGroupOf pid sleep 5 -- give the process a few seconds grace period to die nicely -- seems to happen with some GHC 7.2 compiled binaries with FFI etc terminateProcess pid withCreateProcess :: 
CreateProcess -> ((Maybe Handle, Maybe Handle, Maybe Handle, ProcessHandle) -> IO a) -> IO a withCreateProcess cp act = mask $ \restore -> do ans@(inh, outh, errh, pid) <- createProcess cp onException (restore $ act ans) $ do mapM_ (`whenJust` hClose) [inh, outh, errh] ignore $ do -- sometimes we fail before the process is valid -- therefore if terminate process fails, skip waiting on the process terminateProcess pid void $ waitForProcess pid -- General approach taken from readProcessWithExitCode process :: ProcessOpts -> IO (ProcessHandle, ExitCode) process po = do (ProcessOpts{..}, flushBuffers) <- optimiseBuffers po let files = nubOrd [x | DestFile x <- poStdout ++ poStderr] withs (map (`withFile` WriteMode) files) $ \handles -> do let fileHandle x = fromJust $ lookup x $ zip files handles let cp = (cmdSpec poCommand){cwd = poCwd, env = poEnv, create_group = isJust poTimeout, close_fds = True ,std_in = fst $ stdIn poStdin ,std_out = stdStream fileHandle poStdout poStderr, std_err = stdStream fileHandle poStderr poStdout} withCreateProcess cp $ \(inh, outh, errh, pid) -> do withTimeout poTimeout (abort pid) $ do let streams = [(outh, stdout, poStdout) | Just outh <- [outh], CreatePipe <- [std_out cp]] ++ [(errh, stderr, poStderr) | Just errh <- [errh], CreatePipe <- [std_err cp]] wait <- forM streams $ \(h, hh, dest) -> do let isTied = not $ poStdout `disjoint` poStderr let isBinary = not $ any isDestString dest && not (any isDestBytes dest) when isTied $ hSetBuffering h LineBuffering when (DestEcho `elem` dest) $ do buf <- hGetBuffering hh case buf of BlockBuffering{} -> return () _ -> hSetBuffering h buf if isBinary then do hSetBinaryMode h True dest <- return $ for dest $ \d -> case d of DestEcho -> BS.hPut hh DestFile x -> BS.hPut (fileHandle x) DestString x -> addBuffer x . (if isWindows then replace "\r\n" "\n" else id) . 
BS.unpack DestBytes x -> addBuffer x forkWait $ whileM $ do src <- bs_hGetSome h 4096 mapM_ ($ src) dest notM $ hIsEOF h else if isTied then do dest <- return $ for dest $ \d -> case d of DestEcho -> hPutStrLn hh DestFile x -> hPutStrLn (fileHandle x) DestString x -> addBuffer x . (++ "\n") forkWait $ whileM $ ifM (hIsEOF h) (return False) $ do src <- hGetLine h mapM_ ($ src) dest return True else do src <- hGetContents h wait1 <- forkWait $ C.evaluate $ rnf src waits <- forM dest $ \d -> case d of DestEcho -> forkWait $ hPutStr hh src DestFile x -> forkWait $ hPutStr (fileHandle x) src DestString x -> do addBuffer x src; return $ return () return $ sequence_ $ wait1 : waits whenJust inh $ snd $ stdIn poStdin if poAsync then return (pid, ExitSuccess) else do sequence_ wait flushBuffers res <- waitForProcess pid whenJust outh hClose whenJust errh hClose return (pid, res) --------------------------------------------------------------------- -- COMPATIBILITY -- available in bytestring-0.9.1.10 and above -- implementation copied below bs_hGetSome :: Handle -> Int -> IO BS.ByteString bs_hGetSome h i = BS.createAndTrim i $ \p -> hGetBufSome h p i shake-0.15.5/src/General/Intern.hs0000644000000000000000000000235212560222036015022 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, FlexibleInstances, MultiParamTypeClasses #-} module General.Intern( Intern, Id, empty, insert, add, lookup, toList, fromList ) where import General.Binary import Development.Shake.Classes import Prelude hiding (lookup) import qualified Data.HashMap.Strict as Map import Data.List(foldl') -- Invariant: The first field is the highest value in the Map data Intern a = Intern {-# UNPACK #-} !Word32 !(Map.HashMap a Id) newtype Id = Id Word32 deriving (Eq,Hashable,Binary,Show,NFData) instance BinaryWith w Id where putWith ctx = put getWith ctx = get empty :: Intern a empty = Intern 0 Map.empty insert :: (Eq a, Hashable a) => a -> Id -> Intern a -> Intern a insert k v@(Id i) (Intern n mp) = 
Intern (max n i) $ Map.insert k v mp add :: (Eq a, Hashable a) => a -> Intern a -> (Intern a, Id) add k (Intern v mp) = (Intern v2 $ Map.insert k (Id v2) mp, Id v2) where v2 = v + 1 lookup :: (Eq a, Hashable a) => a -> Intern a -> Maybe Id lookup k (Intern n mp) = Map.lookup k mp toList :: Intern a -> [(a, Id)] toList (Intern a b) = Map.toList b fromList :: (Eq a, Hashable a) => [(a, Id)] -> Intern a fromList xs = Intern (foldl' max 0 [i | (_, Id i) <- xs]) (Map.fromList xs) shake-0.15.5/src/General/Extra.hs0000644000000000000000000000375512560222036014656 0ustar0000000000000000{-# LANGUAGE BangPatterns, CPP #-} module General.Extra( getProcessorCount, randomElem, showQuote, withs, ) where import Control.Exception.Extra import Data.Char import Data.List import System.Environment.Extra import System.IO.Extra import System.IO.Unsafe import System.Random #if __GLASGOW_HASKELL__ >= 704 import Control.Concurrent import Foreign.C.Types #endif --------------------------------------------------------------------- -- Data.List showQuote :: String -> String showQuote xs | any isSpace xs = "\"" ++ concatMap (\x -> if x == '\"' then "\"\"" else [x]) xs ++ "\"" | otherwise = xs --------------------------------------------------------------------- -- System.Info #if __GLASGOW_HASKELL__ >= 704 -- Use the underlying GHC function foreign import ccall getNumberOfProcessors :: IO CInt #endif {-# NOINLINE getProcessorCount #-} getProcessorCount :: IO Int -- unsafePerformIO so we cache the result and only compute it once getProcessorCount = let res = unsafePerformIO act in return res where act = #if __GLASGOW_HASKELL__ >= 704 if rtsSupportsBoundThreads then fmap fromIntegral $ getNumberOfProcessors else #endif handle_ (const $ return 1) $ do env <- lookupEnv "NUMBER_OF_PROCESSORS" case env of Just s | [(i,"")] <- reads s -> return i _ -> do src <- readFile' "/proc/cpuinfo" return $!
length [() | x <- lines src, "processor" `isPrefixOf` x] --------------------------------------------------------------------- -- System.Random randomElem :: [a] -> IO a randomElem xs = do i <- randomRIO (0, length xs - 1) return $ xs !! i --------------------------------------------------------------------- -- Control.Monad withs :: [(a -> r) -> r] -> ([a] -> r) -> r withs [] act = act [] withs (f:fs) act = f $ \a -> withs fs $ \as -> act $ a:as shake-0.15.5/src/General/Concurrent.hs0000644000000000000000000000201312560222036015677 0ustar0000000000000000 module General.Concurrent( Fence, newFence, signalFence, waitFence, testFence, ) where import Control.Applicative import Control.Monad import Data.IORef import Prelude --------------------------------------------------------------------- -- FENCE -- | Like a barrier, but based on callbacks newtype Fence a = Fence (IORef (Either [a -> IO ()] a)) instance Show (Fence a) where show _ = "Fence" newFence :: IO (Fence a) newFence = Fence <$> newIORef (Left []) signalFence :: Fence a -> a -> IO () signalFence (Fence ref) v = join $ atomicModifyIORef ref $ \x -> case x of Left queue -> (Right v, mapM_ ($ v) $ reverse queue) Right v -> error "Shake internal error, signalFence called twice on one Fence" waitFence :: Fence a -> (a -> IO ()) -> IO () waitFence (Fence ref) call = join $ atomicModifyIORef ref $ \x -> case x of Left queue -> (Left (call:queue), return ()) Right v -> (Right v, call v) testFence :: Fence a -> IO (Maybe a) testFence (Fence x) = either (const Nothing) Just <$> readIORef x shake-0.15.5/src/General/Cleanup.hs0000644000000000000000000000273012560222036015152 0ustar0000000000000000 -- | Code for ensuring cleanup actions are run. 
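The callback-based `Fence` in General.Concurrent above can be exercised standalone. The following is a hypothetical demo, not part of the package: it inlines the `Fence` core verbatim so it compiles against base alone, and shows that callbacks registered before `signalFence` run (in registration order) at signal time, while callbacks registered afterwards run immediately.

```haskell
import Control.Monad (join)
import Data.IORef

-- Standalone copy of the Fence core from General.Concurrent, for illustration only.
newtype Fence a = Fence (IORef (Either [a -> IO ()] a))

newFence :: IO (Fence a)
newFence = Fence <$> newIORef (Left [])

signalFence :: Fence a -> a -> IO ()
signalFence (Fence ref) v = join $ atomicModifyIORef ref $ \x -> case x of
    Left queue -> (Right v, mapM_ ($ v) $ reverse queue)  -- run queued waiters in registration order
    Right _ -> error "signalFence called twice on one Fence"

waitFence :: Fence a -> (a -> IO ()) -> IO ()
waitFence (Fence ref) call = join $ atomicModifyIORef ref $ \x -> case x of
    Left queue -> (Left (call:queue), return ())  -- not yet signalled: queue the callback
    Right v -> (Right v, call v)                  -- already signalled: run it now

main :: IO ()
main = do
    fence <- newFence
    waitFence fence $ \v -> putStrLn ("early waiter: " ++ v)
    signalFence fence "done"                                 -- fires the queued callback
    waitFence fence $ \v -> putStrLn ("late waiter: " ++ v)  -- fires immediately
```

Running `main` prints the early waiter's line first, then the late waiter's, both seeing the same signalled value.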
module General.Cleanup( Cleanup, withCleanup, addCleanup ) where import Control.Exception as E import Control.Monad import qualified Data.HashMap.Strict as Map import Data.Function import Data.IORef import Data.List data S = S {unique :: !Int, items :: Map.HashMap Int (IO ())} newtype Cleanup = Cleanup (IORef S) -- | Run with some cleanup scope. Regardless of exceptions/threads, all 'addCleanup' actions -- will be run by the time it exits. The 'addCleanup' actions will be run in reverse order. withCleanup :: (Cleanup -> IO a) -> IO a withCleanup act = do ref <- newIORef $ S 0 Map.empty act (Cleanup ref) `finally` do items <- atomicModifyIORef ref $ \s -> (s{items=Map.empty}, items s) mapM_ snd $ sortBy (compare `on` negate . fst) $ Map.toList items -- | Add a cleanup action to a 'Cleanup' scope. If the return action is not run by the time -- 'withCleanup' terminates then it will be run then. The argument 'Bool' is 'True' to say -- run the action, 'False' to say ignore the action (and never run it). addCleanup :: Cleanup -> IO () -> IO (Bool -> IO ()) addCleanup (Cleanup ref) act = atomicModifyIORef ref $ \s -> let i = unique s in (,) (S (unique s + 1) (Map.insert i act $ items s)) $ \b -> do join $ atomicModifyIORef ref $ \s -> case Map.lookup i $ items s of Nothing -> (s, return ()) Just act -> (s{items = Map.delete i $ items s}, when b act) shake-0.15.5/src/General/Binary.hs0000644000000000000000000000371212560222036015010 0ustar0000000000000000{-# LANGUAGE MultiParamTypeClasses, FlexibleInstances #-} module General.Binary( BinaryWith(..), module Data.Binary, BinList(..), BinFloat(..) 
) where import Control.Applicative import Control.Monad import Data.Binary import Data.List import Foreign import System.IO.Unsafe as U class BinaryWith ctx a where putWith :: ctx -> a -> Put getWith :: ctx -> Get a instance (BinaryWith ctx a, BinaryWith ctx b) => BinaryWith ctx (a,b) where putWith ctx (a,b) = putWith ctx a >> putWith ctx b getWith ctx = liftA2 (,) (getWith ctx) (getWith ctx) instance BinaryWith ctx a => BinaryWith ctx [a] where putWith ctx xs = put (length xs) >> mapM_ (putWith ctx) xs getWith ctx = do n <- get; replicateM n $ getWith ctx instance BinaryWith ctx a => BinaryWith ctx (Maybe a) where putWith ctx Nothing = putWord8 0 putWith ctx (Just x) = putWord8 1 >> putWith ctx x getWith ctx = do i <- getWord8; if i == 0 then return Nothing else fmap Just $ getWith ctx newtype BinList a = BinList {fromBinList :: [a]} instance Show a => Show (BinList a) where show = show . fromBinList instance Binary a => Binary (BinList a) where put (BinList xs) = case splitAt 254 xs of (a, []) -> putWord8 (genericLength xs) >> mapM_ put xs (a, b) -> putWord8 255 >> mapM_ put a >> put (BinList b) get = do x <- getWord8 case x of 255 -> do xs <- replicateM 254 get; BinList ys <- get; return $ BinList $ xs ++ ys n -> fmap BinList $ replicateM (fromInteger $ toInteger n) get newtype BinFloat = BinFloat {fromBinFloat :: Float} instance Show BinFloat where show = show . fromBinFloat instance Binary BinFloat where put (BinFloat x) = put (convert x :: Word32) get = fmap (BinFloat . 
convert) (get :: Get Word32) -- Originally from data-binary-ieee754 package convert :: (Storable a, Storable b) => a -> b convert x = U.unsafePerformIO $ alloca $ \buf -> do poke (castPtr buf) x peek buf shake-0.15.5/src/General/Bilist.hs0000644000000000000000000000145212560222036015011 0ustar0000000000000000 module General.Bilist( Bilist, cons, snoc, uncons, toList, isEmpty ) where import Data.Monoid import Prelude data Bilist a = Bilist [a] [a] toList :: Bilist a -> [a] toList (Bilist as bs) = as ++ reverse bs isEmpty :: Bilist a -> Bool isEmpty (Bilist as bs) = null as && null bs instance Eq a => Eq (Bilist a) where a == b = toList a == toList b instance Monoid (Bilist a) where mempty = Bilist [] [] mappend a b = Bilist (toList a ++ toList b) [] cons :: a -> Bilist a -> Bilist a cons x (Bilist as bs) = Bilist (x:as) bs snoc :: Bilist a -> a -> Bilist a snoc (Bilist as bs) x = Bilist as (x:bs) uncons :: Bilist a -> Maybe (a, Bilist a) uncons (Bilist [] []) = Nothing uncons (Bilist (a:as) bs) = Just (a, Bilist as bs) uncons (Bilist [] bs) = uncons $ Bilist (reverse bs) [] shake-0.15.5/src/Development/0000755000000000000000000000000012560222036014132 5ustar0000000000000000shake-0.15.5/src/Development/Shake.hs0000644000000000000000000002112712560222036015524 0ustar0000000000000000 -- | This module is used for defining Shake build systems. As a simple example of a Shake build system, -- let us build the file @result.tar@ from the files listed by @result.txt@: -- -- @ --import "Development.Shake" --import "Development.Shake.FilePath" -- --main = 'shakeArgs' 'shakeOptions' $ do -- 'want' [\"result.tar\"] -- \"*.tar\" '%>' \\out -> do -- contents \<- 'readFileLines' $ out 'Development.Shake.FilePath.-<.>' \"txt\" -- 'need' contents -- 'cmd' \"tar -cf\" [out] contents -- @ -- -- We start by importing the modules defining both Shake and routines for manipulating 'FilePath' values. -- We define @main@ to call 'shake' with the default 'shakeOptions'. 
As the second argument to -- 'shake', we provide a set of rules. There are two common forms of rules, 'want' to specify target files, -- and '%>' to define a rule which builds a 'FilePattern'. We use 'want' to require that after the build -- completes the file @result.tar@ should be ready. -- -- The @*.tar@ rule describes how to build files with the extension @.tar@, including @result.tar@. -- We 'readFileLines' on @result.txt@, after changing the @.tar@ extension to @.txt@. We read each line -- into the variable @contents@ -- being a list of the files that should go into @result.tar@. Next, we -- depend ('need') all the files in @contents@. If any of these files change, the rule will be repeated. -- Finally we call the @tar@ program. If either @result.txt@ changes, or any of the files listed by @result.txt@ -- change, then @result.tar@ will be rebuilt. -- -- To find out more: -- -- * The user manual contains a longer example and background information on how to use Shake -- . -- -- * The home page has links to additional information , including -- a mailing list. -- -- * The theory behind Shake is described in an ICFP 2012 paper, /Shake Before Building -- Replacing Make with Haskell/ -- . The associated talk -- forms a short overview of Shake . -- -- /== WRITING A BUILD SYSTEM ==============================/ -- -- When writing a Shake build system, start by defining what you 'want', then write rules -- with '%>' to produce the results. Before calling 'cmd' you should ensure that any files the command -- requires are demanded with calls to 'need'. We offer the following advice to Shake users: -- -- * If @ghc --make@ or @cabal@ is capable of building your project, use that instead. Custom build systems are -- necessary for many complex projects, but many projects are not complex. -- -- * The 'shakeArgs' function automatically handles command line arguments. To define non-file targets use 'phony'. 
-- -- * Put all result files in a distinguished directory, for example @_make@. You can implement a @clean@ -- command by removing that directory, using @'removeFilesAfter' \"_make\" [\"\/\/\*\"]@. -- -- * To obtain parallel builds set 'shakeThreads' to a number greater than 1. -- -- * Lots of compilers produce @.o@ files. To avoid overlapping rules, use @.c.o@ for C compilers, -- @.hs.o@ for Haskell compilers etc. -- -- * Do not be afraid to mix Shake rules, system commands and other Haskell libraries -- use each for what -- it does best. -- -- * The more accurate the dependencies are, the better. Use additional rules like 'doesFileExist' and -- 'getDirectoryFiles' to track information other than just the contents of files. For information in the environment -- that you suspect will change regularly (perhaps @ghc@ version number), either write the information to -- a file with 'alwaysRerun' and 'writeFileChanged', or use 'addOracle'. -- -- /== GHC BUILD FLAGS ==============================/ -- -- For large build systems the choice of GHC flags can have a significant impact. We recommend: -- -- > ghc --make MyBuildSystem -rtsopts -with-rtsopts=-I0 -- -- * @-rtsopts@: Allow the setting of further GHC options at runtime. -- -- * @-I0@: Disable idle garbage collection. In a build system regularly running many system -- commands the program appears \"idle\" very often, triggering regular unnecessary garbage collection, stealing -- resources from the program doing actual work. -- -- * With GHC 7.6 and before, omit @-threaded@: A GHC bug 7646 -- can cause a race condition in build systems that write files then read them. Omitting @-threaded@ will -- still allow your 'cmd' actions to run in parallel, so most build systems will still run in parallel. -- -- * With GHC 7.8 and later you may add @-threaded@, and pass the options @-qg -qb@ to @-with-rtsopts@ -- to disable parallel garbage collection. 
Parallel garbage collection in Shake -- programs typically goes slower than sequential garbage collection, while occupying many cores that -- could be used for running system commands. -- -- /Acknowledgements/: Thanks to Austin Seipp for properly integrating the profiling code. module Development.Shake( -- * Core shake, shakeOptions, Rules, action, withoutActions, alternatives, priority, Action, traced, liftIO, actionOnException, actionFinally, ShakeException(..), -- * Configuration ShakeOptions(..), Assume(..), Lint(..), Change(..), getShakeOptions, getHashedShakeVersion, -- ** Command line shakeArgs, shakeArgsWith, shakeOptDescrs, -- ** Progress reporting Progress(..), progressSimple, progressDisplay, progressTitlebar, progressProgram, -- ** Verbosity Verbosity(..), getVerbosity, putLoud, putNormal, putQuiet, withVerbosity, quietly, -- * Running commands command, command_, cmd, unit, Stdout(..), Stderr(..), Stdouterr(..), Exit(..), CmdTime(..), CmdLine(..), CmdResult, CmdString, CmdOption(..), addPath, addEnv, -- * Utility functions copyFile', copyFileChanged, readFile', readFileLines, writeFile', writeFileLines, writeFileChanged, removeFiles, removeFilesAfter, withTempFile, withTempDir, -- * File rules need, want, (%>), (|%>), (?>), phony, (~>), phonys, (&%>), (&?>), orderOnly, FilePattern, (?==), (), needed, trackRead, trackWrite, trackAllow, -- * Directory rules doesFileExist, doesDirectoryExist, getDirectoryContents, getDirectoryFiles, getDirectoryDirs, -- * Environment rules getEnv, getEnvWithDefault, -- * Oracle rules addOracle, askOracle, askOracleWith, -- * Special rules alwaysRerun, -- * Resources Resource, newResource, newResourceIO, withResource, withResources, newThrottle, newThrottleIO, unsafeExtraThread, -- * Cache newCache, newCacheIO, -- * Deprecated (*>), (|*>), (&*>), (**>), (*>>), (?>>), system', systemCwd, systemOutput ) where import Prelude(Maybe, FilePath) -- Since GHC 7.10 duplicates *> -- I would love to use module export in the above 
export list, but alas Haddock -- then shows all the things that are hidden in the docs, which is terrible. import Control.Monad.IO.Class import Development.Shake.Types import Development.Shake.Core hiding (trackAllow) import Development.Shake.Derived import Development.Shake.Errors import Development.Shake.Progress import Development.Shake.Args import Development.Shake.Shake import Development.Shake.Command import Development.Shake.Rules.Directory import Development.Shake.Rules.File import Development.Shake.FilePattern import Development.Shake.Rules.Files import Development.Shake.Rules.Oracle import Development.Shake.Rules.OrderOnly import Development.Shake.Rules.Rerun --------------------------------------------------------------------- -- DEPRECATED SINCE 0.13, MAY 2014 infix 1 **>, ?>>, *>> -- | /Deprecated:/ Alias for '|%>'. (**>) :: [FilePattern] -> (FilePath -> Action ()) -> Rules () (**>) = (|%>) -- | /Deprecated:/ Alias for '&?>'. (?>>) :: (FilePath -> Maybe [FilePath]) -> ([FilePath] -> Action ()) -> Rules () (?>>) = (&?>) -- | /Deprecated:/ Alias for '&%>'. (*>>) :: [FilePattern] -> ([FilePath] -> Action ()) -> Rules () (*>>) = (&%>) --------------------------------------------------------------------- -- DEPRECATED SINCE 0.14, MAY 2014 infix 1 *>, |*>, &*> -- | /Deprecated:/ Alias for '%>'. Note that @*>@ clashes with a Prelude operator in GHC 7.10. (*>) :: FilePattern -> (FilePath -> Action ()) -> Rules () (*>) = (%>) -- | /Deprecated:/ Alias for '|%>'. (|*>) :: [FilePattern] -> (FilePath -> Action ()) -> Rules () (|*>) = (|%>) -- | /Deprecated:/ Alias for '&%>'. 
(&*>) :: [FilePattern] -> ([FilePath] -> Action ()) -> Rules () (&*>) = (&%>) shake-0.15.5/src/Development/Shake/0000755000000000000000000000000012560222036015165 5ustar0000000000000000shake-0.15.5/src/Development/Shake/Value.hs0000644000000000000000000001171312560222036016600 0ustar0000000000000000{-# LANGUAGE ExistentialQuantification, GeneralizedNewtypeDeriving, MultiParamTypeClasses #-} {- | This module implements the Key/Value types, to abstract over heterogeneous data types. -} module Development.Shake.Value( Value, newValue, fromValue, typeValue, Key, newKey, fromKey, typeKey, Witness, currentWitness, registerWitness ) where import General.Binary import Development.Shake.Classes import Development.Shake.Errors import Data.Typeable import Data.Bits import Data.Function import Data.IORef import Data.List import Data.Maybe import qualified Data.HashMap.Strict as Map import qualified Data.ByteString.Char8 as BS import System.IO.Unsafe -- We deliberately avoid Typeable instances on Key/Value to stop them accidentally -- being used inside themselves newtype Key = Key Value deriving (Eq,Hashable,NFData,BinaryWith Witness) data Value = forall a . (Eq a, Show a, Typeable a, Hashable a, Binary a, NFData a) => Value a newKey :: (Eq a, Show a, Typeable a, Hashable a, Binary a, NFData a) => a -> Key newKey = Key . 
newValue newValue :: (Eq a, Show a, Typeable a, Hashable a, Binary a, NFData a) => a -> Value newValue = Value typeKey :: Key -> TypeRep typeKey (Key v) = typeValue v typeValue :: Value -> TypeRep typeValue (Value x) = typeOf x fromKey :: Typeable a => Key -> a fromKey (Key v) = fromValue v fromValue :: Typeable a => Value -> a fromValue (Value x) = fromMaybe (err "fromValue, bad cast") $ cast x instance Show Key where show (Key a) = show a instance Show Value where show (Value a) = show a instance NFData Value where rnf (Value a) = rnf a instance Hashable Value where hashWithSalt salt (Value a) = hashWithSalt salt (typeOf a) `xor` hashWithSalt salt a instance Eq Value where Value a == Value b = maybe False (a ==) $ cast b Value a /= Value b = maybe True (a /=) $ cast b --------------------------------------------------------------------- -- BINARY INSTANCES {-# NOINLINE witness #-} witness :: IORef (Map.HashMap TypeRep Value) witness = unsafePerformIO $ newIORef Map.empty registerWitness :: (Eq a, Show a, Typeable a, Hashable a, Binary a, NFData a) => a -> IO () registerWitness x = atomicModifyIORef witness $ \mp -> (Map.insert (typeOf x) (Value $ err msg `asTypeOf` x) mp, ()) where msg = "registerWitness, type " ++ show (typeOf x) -- Produce a list in a predictable order from a Map TypeRep, which should be consistent regardless of the order -- elements were added and stable between program executions. -- Cannot rely on hash (not pure in hashable-1.2) or compare (not available before 7.2) toStableList :: Map.HashMap TypeRep v -> [(TypeRep,v)] toStableList = sortBy (compare `on` show . fst) . 
Map.toList data Witness = Witness {typeNames :: [String] -- the canonical data, the names of the types ,witnessIn :: Map.HashMap Word16 Value -- for reading in, to find the values (some may be missing) ,witnessOut :: Map.HashMap TypeRep Word16 -- for writing out, find the value } deriving Show instance Eq Witness where -- Type names are produced by toStableList so should remain consistent -- regardless of the order of registerWitness calls. a == b = typeNames a == typeNames b currentWitness :: IO Witness currentWitness = do ws <- readIORef witness let (ks,vs) = unzip $ toStableList ws return $ Witness (map show ks) (Map.fromList $ zip [0..] vs) (Map.fromList $ zip ks [0..]) instance Binary Witness where put (Witness ts _ _) = put $ BS.unlines $ map BS.pack ts get = do ts <- fmap (map BS.unpack . BS.lines) get let ws = toStableList $ unsafePerformIO $ readIORefAfter ts witness let (is,ks,vs) = unzip3 [(i,k,v) | (i,t) <- zip [0..] ts, (k,v):_ <- [filter ((==) t . show . fst) ws]] return $ Witness ts (Map.fromList $ zip is vs) (Map.fromList $ zip ks is) where -- Read an IORef after examining a variable, used to avoid GHC over-optimisation {-# NOINLINE readIORefAfter #-} readIORefAfter :: a -> IORef b -> IO b readIORefAfter v ref = v `seq` readIORef ref instance BinaryWith Witness Value where putWith ws (Value x) = do let msg = "no witness for " ++ show (typeOf x) put $ fromMaybe (error msg) $ Map.lookup (typeOf x) (witnessOut ws) put x getWith ws = do h <- get case Map.lookup h $ witnessIn ws of Nothing | h >= 0 && h < genericLength (typeNames ws) -> error $ "Failed to find a type " ++ (typeNames ws !! fromIntegral h) ++ " which is stored in the database.\n" ++ "The most likely cause is that your build tool has changed significantly." Nothing -> error $ -- should not happen, unless there is real data corruption "Corruption when reading Value, got type " ++ show h ++ ", but should be in range 0.." 
++ show (length (typeNames ws) - 1) Just (Value t) -> do x <- get return $ Value $ x `asTypeOf` t shake-0.15.5/src/Development/Shake/Util.hs0000644000000000000000000001107312560222036016440 0ustar0000000000000000 -- | A module for useful utility functions for Shake build systems. module Development.Shake.Util( parseMakefile, needMakefileDependencies, neededMakefileDependencies, shakeArgsAccumulate, shakeArgsPrune, shakeArgsPruneWith, ) where import Development.Shake import Development.Shake.Rules.File import qualified Data.ByteString.Char8 as BS import qualified Development.Shake.ByteString as BS import Data.Tuple.Extra import Control.Applicative import Data.List import System.Console.GetOpt import Data.IORef import Data.Maybe import Control.Monad.Extra import Prelude import System.IO.Extra as IO -- | Given the text of a Makefile, extract the list of targets and dependencies. Assumes a -- small subset of Makefile syntax, mostly that generated by @gcc -MM@. -- -- > parseMakefile "a: b c\nd : e" == [("a",["b","c"]),("d",["e"])] parseMakefile :: String -> [(FilePath, [FilePath])] parseMakefile = map (BS.unpack *** map BS.unpack) . BS.parseMakefile . BS.pack -- | Depend on the dependencies listed in a Makefile. Does not depend on the Makefile itself. -- -- > needMakefileDependencies file = need . concatMap snd . parseMakefile =<< liftIO (readFile file) needMakefileDependencies :: FilePath -> Action () needMakefileDependencies file = needBS . concatMap snd . BS.parseMakefile =<< liftIO (BS.readFile file) -- | Depend on the dependencies listed in a Makefile. Does not depend on the Makefile itself. -- Use this function to indicate that you have /already/ used the files in question. -- -- > neededMakefileDependencies file = needed . concatMap snd . parseMakefile =<< liftIO (readFile file) neededMakefileDependencies :: FilePath -> Action () neededMakefileDependencies file = neededBS . concatMap snd . 
BS.parseMakefile =<< liftIO (BS.readFile file) -- | Like `shakeArgsWith`, but instead of accumulating a list of flags, apply functions to a default value. -- Usually used to populate a record structure. As an example of a build system that can use either @gcc@ or @distcc@ for compiling: -- -- @ --import System.Console.GetOpt -- --data Flags = Flags {distCC :: Bool} deriving Eq --flags = [Option \"\" [\"distcc\"] (NoArg $ Right $ \\x -> x{distCC=True}) \"Run distributed.\"] -- --main = 'shakeArgsAccumulate' 'shakeOptions' flags (Flags False) $ \\flags targets -> return $ Just $ do -- if null targets then 'want' [\"result.exe\"] else 'want' targets -- let compiler = if distCC flags then \"distcc\" else \"gcc\" -- \"*.o\" '%>' \\out -> do -- 'need' ... -- 'cmd' compiler ... -- ... -- @ -- -- Now you can pass @--distcc@ to use the @distcc@ compiler. shakeArgsAccumulate :: ShakeOptions -> [OptDescr (Either String (a -> a))] -> a -> (a -> [String] -> IO (Maybe (Rules ()))) -> IO () shakeArgsAccumulate opts flags def f = shakeArgsWith opts flags $ \flags targets -> f (foldl' (flip ($)) def flags) targets -- | Like 'shakeArgs' but also takes a pruning function. If @--prune@ is passed, then after the build has completed, -- the second argument is called with a list of the files that the build checked were up-to-date. shakeArgsPrune :: ShakeOptions -> ([FilePath] -> IO ()) -> Rules () -> IO () shakeArgsPrune opts prune rules = shakeArgsPruneWith opts prune [] f where f _ files = return $ Just $ if null files then rules else want files >> withoutActions rules -- | A version of 'shakeArgsPrune' that also takes a list of extra options to use. 
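The Makefile helpers above are most often combined with compiler-generated dependency files inside a compile rule. A hedged sketch, where the file layout and the exact gcc flag spelling are assumptions:

```haskell
-- Track header dependencies reported by gcc. neededMakefileDependencies is
-- the right call here because the compiler has already read the headers by
-- the time the .d file is parsed.
import Development.Shake
import Development.Shake.FilePath
import Development.Shake.Util

rules :: Rules ()
rules =
    "_make//*.o" %> \out -> do
        let src = dropDirectory1 $ out -<.> "c"  -- hypothetical layout
        need [src]
        cmd "gcc -c" [src] "-o" [out] "-MMD -MF" [out -<.> "d"]
        neededMakefileDependencies $ out -<.> "d"
```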
shakeArgsPruneWith :: ShakeOptions -> ([FilePath] -> IO ()) -> [OptDescr (Either String a)] -> ([a] -> [String] -> IO (Maybe (Rules ()))) -> IO () shakeArgsPruneWith opts prune flags act = do let flags2 = Option "P" ["prune"] (NoArg $ Right Nothing) "Remove stale files" : map (fmapOptDescr $ fmap Just) flags pruning <- newIORef False shakeArgsWith opts flags2 $ \opts args -> if any isNothing opts then do writeIORef pruning True return Nothing else act (map fromJust opts) args whenM (readIORef pruning) $ do IO.withTempFile $ \file -> do shakeArgsWith opts{shakeLiveFiles=file : shakeLiveFiles opts} flags2 $ \opts args -> act (catMaybes opts) args src <- lines <$> IO.readFile' file prune src -- fmap is only an instance in later GHC versions, so fake our own version fmapOptDescr :: (a -> b) -> OptDescr a -> OptDescr b fmapOptDescr f (Option a b argDescr c) = Option a b (fmapArgDescr f argDescr) c fmapArgDescr :: (a -> b) -> ArgDescr a -> ArgDescr b fmapArgDescr f (NoArg a) = NoArg (f a) fmapArgDescr f (ReqArg g s) = ReqArg (f . g) s fmapArgDescr f (OptArg g s) = OptArg (f . g) s shake-0.15.5/src/Development/Shake/Types.hs0000644000000000000000000003361512560222036016635 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable, PatternGuards #-} -- | Types exposed to the user module Development.Shake.Types( Progress(..), Verbosity(..), Assume(..), Lint(..), Change(..), EqualCost(..), ShakeOptions(..), shakeOptions ) where import Data.Data import Data.List import Development.Shake.Progress import qualified Data.ByteString.Char8 as BS -- | The current assumptions made by the build system, used by 'shakeAssume'. These options -- allow the end user to specify that any rules run are either to be treated as clean, or as -- dirty, regardless of what the build system thinks. -- -- These assumptions only operate on files reached by the current 'Development.Shake.action' commands. Any -- other files in the database are left unchanged. 
data Assume = AssumeDirty -- ^ Assume that all rules reached are dirty and require rebuilding, equivalent to 'Development.Shake.Rule.storedValue' always -- returning 'Nothing'. Useful to undo the results of 'AssumeClean', for benchmarking rebuild speed and -- for rebuilding if untracked dependencies have changed. This assumption is safe, but may cause -- more rebuilding than necessary. | AssumeClean -- ^ /This assumption is unsafe, and may lead to incorrect build results in this run, and in future runs/. -- Assume and record that all rules reached are clean and do not require rebuilding, provided the rule -- has a 'Development.Shake.Rule.storedValue' and has been built before. Useful if you have modified a file in some -- inconsequential way, such as only the comments or whitespace, and wish to avoid a rebuild. | AssumeSkip -- ^ /This assumption is unsafe, and may lead to incorrect build results in this run/. -- Assume that all rules reached are clean in this run. Only useful for benchmarking, to remove any overhead -- from running 'Development.Shake.Rule.storedValue' operations. deriving (Eq,Ord,Show,Data,Typeable,Bounded,Enum) -- | Which lint checks to perform, used by 'shakeLint'. data Lint = LintBasic -- ^ The most basic form of linting. Checks that the current directory does not change and that results do not change after they -- are first written. Any calls to 'needed' will assert that they do not cause a rule to be rebuilt. | LintTracker -- ^ Track which files are accessed by command line programs run by 'command' or 'cmd', using @tracker.exe@ as supplied -- with the Microsoft .NET 4.5 SDK (Windows only). Also performs all checks from 'LintBasic'. Note that some programs are not -- tracked properly, particularly cygwin programs (it seems). deriving (Eq,Ord,Show,Data,Typeable,Bounded,Enum) -- | How should you determine if a file has changed, used by 'shakeChange'. 
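As a concrete illustration of the assumptions above: 'AssumeClean' gives a @make --touch@ analogue, and 'AssumeDirty' undoes it. A sketch, where @myRules@ stands in for a real rule set:

```haskell
import Development.Shake

-- Record everything reached as up to date, like "make --touch".
-- Unsafe: only use when the edits were genuinely inconsequential.
touchBuild :: Rules () -> IO ()
touchBuild myRules = shake shakeOptions{shakeAssume = Just AssumeClean} myRules

-- Force everything reached to rebuild, e.g. to undo a previous AssumeClean.
rebuildAll :: Rules () -> IO ()
rebuildAll myRules = shake shakeOptions{shakeAssume = Just AssumeDirty} myRules
```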
The most common values are -- 'ChangeModtime' (very fast, @touch@ causes files to rebuild) and 'ChangeModtimeAndDigestInput' -- (a bit slower, @touch@ does not cause input files to rebuild). data Change = ChangeModtime -- ^ Compare equality of modification timestamps, a file has changed if its last modified time changes. -- A @touch@ will force a rebuild. This mode is fast and usually sufficiently accurate, so is the default. | ChangeDigest -- ^ Compare equality of file contents digests, a file has changed if its digest changes. -- A @touch@ will not force a rebuild. Use this mode if modification times on your file system are unreliable. | ChangeModtimeAndDigest -- ^ A file is rebuilt if both its modification time and digest have changed. For efficiency reasons, the modification -- time is checked first, and if that has changed, the digest is checked. | ChangeModtimeAndDigestInput -- ^ Use 'ChangeModtimeAndDigest' for input\/source files and 'ChangeModtime' for output files. | ChangeModtimeOrDigest -- ^ A file is rebuilt if either its modification time or its digest has changed. A @touch@ will force a rebuild, -- but even if a file's modification time is reset afterwards, changes will also cause a rebuild. deriving (Eq,Ord,Show,Data,Typeable,Bounded,Enum) -- | Options to control the execution of Shake, usually specified by overriding fields in -- 'shakeOptions': -- -- @ 'shakeOptions'{'shakeThreads'=4, 'shakeReport'=[\"report.html\"]} @ -- -- The 'Data' instance for this type reports the 'shakeProgress' and 'shakeOutput' fields as having the abstract type 'Function', -- because 'Data' cannot be defined for functions. data ShakeOptions = ShakeOptions {shakeFiles :: FilePath -- ^ Defaults to @.shake@. The directory used for storing Shake metadata files. -- All metadata files will be named @'shakeFiles'\/.shake./file-name/@, for some @/file-name/@. -- If the 'shakeFiles' directory does not exist it will be created. ,shakeThreads :: Int -- ^ Defaults to @1@. 
Maximum number of rules to run in parallel, similar to @make --jobs=/N/@. -- For many build systems, a number equal to or slightly less than the number of physical processors -- works well. Use @0@ to match the detected number of processors (when @0@, 'getShakeOptions' will -- return the number of threads used). ,shakeVersion :: String -- ^ Defaults to @"1"@. The version number of your build rules. -- Change the version number to force a complete rebuild, such as when making -- significant changes to the rules that require a wipe. The version number should be -- set in the source code, and not passed on the command line. ,shakeVerbosity :: Verbosity -- ^ Defaults to 'Normal'. What level of messages should be printed out. ,shakeStaunch :: Bool -- ^ Defaults to 'False'. Operate in staunch mode, where building continues even after errors, -- similar to @make --keep-going@. ,shakeReport :: [FilePath] -- ^ Defaults to @[]@. Write a profiling report to a file, showing which rules rebuilt, -- why, and how much time they took. Useful for improving the speed of your build systems. -- If the file extension is @.json@ it will write JSON data; if @.js@ it will write Javascript; -- if @.trace@ it will write trace events (load into @about:\/\/tracing@ in Chrome); -- otherwise it will write HTML. ,shakeLint :: Maybe Lint -- ^ Defaults to 'Nothing'. Perform sanity checks during building, see 'Lint' for details. ,shakeFlush :: Maybe Double -- ^ Defaults to @'Just' 10@. How often to flush Shake metadata files in seconds, or 'Nothing' to never flush explicitly. -- It is possible that on abnormal termination (not Haskell exceptions) any rules that completed in the last -- 'shakeFlush' seconds will be lost. ,shakeAssume :: Maybe Assume -- ^ Defaults to 'Nothing'. Assume all build objects are clean/dirty, see 'Assume' for details. -- Can be used to implement @make --touch@. ,shakeAbbreviations :: [(String,String)] -- ^ Defaults to @[]@. 
A list of substrings that should be abbreviated in status messages, and their corresponding abbreviation. -- Commonly used to replace the long paths (e.g. @.make\/i586-linux-gcc\/output@) with an abbreviation (e.g. @$OUT@). ,shakeStorageLog :: Bool -- ^ Defaults to 'False'. Write a message to @'shakeFiles'\/.shake.storage.log@ whenever a storage event happens which may impact -- on the current stored progress. Examples include database version number changes, database compaction or corrupt files. ,shakeLineBuffering :: Bool -- ^ Defaults to 'True'. Change 'stdout' and 'stderr' to line buffering while running Shake. ,shakeTimings :: Bool -- ^ Defaults to 'False'. Print timing information for each stage at the end. ,shakeRunCommands :: Bool -- ^ Default to 'True'. Should you run command line actions, set to 'False' to skip actions whose output streams and exit code -- are not used. Useful for profiling the non-command portion of the build system. ,shakeChange :: Change -- ^ Default to 'ChangeModtime'. How to check if a file has changed, see 'Change' for details. ,shakeCreationCheck :: Bool -- ^ Default to 'True'. After running a rule to create a file, is it an error if the file does not exist. -- Provided for compatibility with @make@ and @ninja@ (which have ugly file creation semantics). -- ,shakeOutputCheck :: Bool -- -- ^ Default to 'True'. If a file produced by a rule changes, should you rebuild it. ,shakeLiveFiles :: [FilePath] -- ^ Default to '[]'. After the build system completes, write a list of all files which were /live/ in that run, -- i.e. those which Shake checked were valid or rebuilt. Produces best answers if nothing rebuilds. ,shakeVersionIgnore :: Bool -- ^ Defaults to 'False'. Ignore any differences in 'shakeVersion'. ,shakeProgress :: IO Progress -> IO () -- ^ Defaults to no action. A function called when the build starts, allowing progress to be reported. 
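For the progress hook just described, the prebuilt helpers compose directly with 'shakeProgress'. A sketch (with a hypothetical stand-in rule set) that renders progress to the terminal title bar every five seconds:

```haskell
import Development.Shake

-- progressDisplay samples the Progress value every N seconds and passes a
-- rendered string to the given sink; progressTitlebar writes that string to
-- the terminal title bar.
main :: IO ()
main = shake shakeOptions{shakeProgress = progressDisplay 5 progressTitlebar} myRules
  where myRules = return () :: Rules ()  -- stand-in rule set
```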
-- The function is called on a separate thread, and that thread is killed when the build completes. -- For applications that want to display progress messages, 'progressSimple' is often sufficient, but more advanced -- users should look at the 'Progress' data type. ,shakeOutput :: Verbosity -> String -> IO () -- ^ Defaults to writing using 'putStrLn'. A function called to output messages from Shake, along with the 'Verbosity' at -- which that message should be printed. This function will be called atomically from all other 'shakeOutput' functions. -- The 'Verbosity' will always be greater than or higher than 'shakeVerbosity'. } deriving Typeable -- | The default set of 'ShakeOptions'. shakeOptions :: ShakeOptions shakeOptions = ShakeOptions ".shake" 1 "1" Normal False [] Nothing (Just 10) Nothing [] False True False True ChangeModtime True [] False (const $ return ()) (const $ BS.putStrLn . BS.pack) -- try and output atomically using BS fieldsShakeOptions = ["shakeFiles", "shakeThreads", "shakeVersion", "shakeVerbosity", "shakeStaunch", "shakeReport" ,"shakeLint", "shakeFlush", "shakeAssume", "shakeAbbreviations", "shakeStorageLog" ,"shakeLineBuffering", "shakeTimings", "shakeRunCommands", "shakeChange", "shakeCreationCheck" ,"shakeLiveFiles","shakeVersionIgnore","shakeProgress", "shakeOutput"] tyShakeOptions = mkDataType "Development.Shake.Types.ShakeOptions" [conShakeOptions] conShakeOptions = mkConstr tyShakeOptions "ShakeOptions" fieldsShakeOptions Prefix unhide x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 x12 x13 x14 x15 x16 x17 x18 y1 y2 = ShakeOptions x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 x12 x13 x14 x15 x16 x17 x18 (fromFunction y1) (fromFunction y2) instance Data ShakeOptions where gfoldl k z (ShakeOptions x1 x2 x3 x4 x5 x6 x7 x8 x9 x10 x11 x12 x13 x14 x15 x16 x17 x18 y1 y2) = z unhide `k` x1 `k` x2 `k` x3 `k` x4 `k` x5 `k` x6 `k` x7 `k` x8 `k` x9 `k` x10 `k` x11 `k` x12 `k` x13 `k` x14 `k` x15 `k` x16 `k` x17 `k` x18 `k` Function y1 `k` Function y2 gunfold k z c = k $ 
k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ k $ z unhide toConstr ShakeOptions{} = conShakeOptions dataTypeOf _ = tyShakeOptions instance Show ShakeOptions where show x = "ShakeOptions {" ++ intercalate ", " inner ++ "}" where inner = zipWith (\x y -> x ++ " = " ++ y) fieldsShakeOptions $ gmapQ f x f x | Just x <- cast x = show (x :: Int) | Just x <- cast x = show (x :: FilePath) | Just x <- cast x = show (x :: Verbosity) | Just x <- cast x = show (x :: Change) | Just x <- cast x = show (x :: Bool) | Just x <- cast x = show (x :: [FilePath]) | Just x <- cast x = show (x :: Maybe Assume) | Just x <- cast x = show (x :: Maybe Lint) | Just x <- cast x = show (x :: Maybe Double) | Just x <- cast x = show (x :: [(String,String)]) | Just x <- cast x = show (x :: Function (IO Progress -> IO ())) | Just x <- cast x = show (x :: Function (Verbosity -> String -> IO ())) | otherwise = error $ "Error while showing ShakeOptions, missing alternative for " ++ show (typeOf x) -- | Internal type, copied from Hide in Uniplate newtype Function a = Function {fromFunction :: a} deriving Typeable instance Show (Function a) where show _ = "" instance Typeable a => Data (Function a) where gfoldl k z x = z x gunfold k z c = error "Development.Shake.Types.ShakeProgress: gunfold not implemented - data type has no constructors" toConstr _ = error "Development.Shake.Types.ShakeProgress: toConstr not implemented - data type has no constructors" dataTypeOf _ = tyFunction tyFunction = mkDataType "Development.Shake.Types.Function" [] -- | The verbosity data type, used by 'shakeVerbosity'. data Verbosity = Silent -- ^ Don't print any messages. | Quiet -- ^ Only print essential messages, typically errors. | Normal -- ^ Print errors and @# /command-name/ (for /file-name/)@ when running a 'Development.Shake.traced' command. | Loud -- ^ Print errors and full command lines when running a 'Development.Shake.command' or 'Development.Shake.cmd' command. 
| Chatty -- ^ Print errors, full command line and status messages when starting a rule. | Diagnostic -- ^ Print messages for virtually everything (mostly for debugging). deriving (Eq,Ord,Bounded,Enum,Show,Read,Typeable,Data) -- | An equality check and a cost. data EqualCost = EqualCheap -- ^ The equality check was cheap. | EqualExpensive -- ^ The equality check was expensive, as the results are not trivially equal. | NotEqual -- ^ The values are not equal. deriving (Eq,Ord,Bounded,Enum,Show,Read,Typeable,Data) shake-0.15.5/src/Development/Shake/Storage.hs0000644000000000000000000002652312560222036017135 0ustar0000000000000000{-# LANGUAGE ScopedTypeVariables, PatternGuards, NamedFieldPuns, FlexibleInstances, MultiParamTypeClasses #-} {- This module stores the meta-data, so it's very important that it's always accurate. We can't rely on getting any exceptions or termination at the end, so we'd better write out a journal. We store a series of records, and if the file contains twice as many records as needed, we compress. -} module Development.Shake.Storage( withStorage ) where import General.Binary import Development.Shake.Types import General.Timing import Data.Tuple.Extra import Control.Exception.Extra import Control.Monad.Extra import Control.Concurrent.Extra import Data.Binary.Get import Data.Binary.Put import Data.Time import Data.Char import Development.Shake.Classes import qualified Data.HashMap.Strict as Map import Data.List import Numeric import System.Directory import System.Exit import System.FilePath import System.IO import qualified Data.ByteString.Lazy.Char8 as LBS import qualified Data.ByteString.Lazy as LBS8 type Map = Map.HashMap -- Increment every time the on-disk format/semantics change, -- @x@ is for the user's version number databaseVersion :: String -> String -- THINGS I WANT TO DO ON THE NEXT CHANGE -- * Change filepaths to store a 1 byte prefix saying 8bit ASCII or UTF8 -- * Duration and Time should be stored as number of 1/10000th seconds Int32 databaseVersion 
x = "SHAKE-DATABASE-11-" ++ s ++ "\r\n" where s = tail $ init $ show x -- call show, then take off the leading/trailing quotes -- ensures we do not get \r or \n in the user portion -- Split the version off a file splitVersion :: LBS.ByteString -> (LBS.ByteString, LBS.ByteString) splitVersion abc = (a `LBS.append` b, c) where (a,bc) = LBS.break (== '\r') abc (b,c) = LBS.splitAt 2 bc withStorage :: (Show k, Show v, Eq w, Eq k, Hashable k ,Binary w, BinaryWith w k, BinaryWith w v) => ShakeOptions -- ^ Storage options -> (String -> IO ()) -- ^ Logging function -> w -- ^ Witness -> (Map k v -> (k -> v -> IO ()) -> IO a) -- ^ Execute -> IO a withStorage ShakeOptions{shakeVerbosity,shakeOutput,shakeVersion,shakeVersionIgnore,shakeFlush,shakeFiles,shakeStorageLog} diagnostic witness act = do let dbfile = shakeFiles ".shake.database" bupfile = shakeFiles ".shake.backup" createDirectoryIfMissing True shakeFiles -- complete a partially failed compress b <- doesFileExist bupfile when b $ do unexpected "Backup file exists, restoring over the previous file\n" diagnostic $ "Backup file move to original" ignore $ removeFile dbfile renameFile bupfile dbfile addTiming "Database read" withBinaryFile dbfile ReadWriteMode $ \h -> do n <- hFileSize h diagnostic $ "Reading file of size " ++ show n (oldVer,src) <- fmap splitVersion $ LBS.hGet h $ fromInteger n verEqual <- evaluate $ ver == oldVer -- force it so we don't leak the bytestring if not verEqual && not shakeVersionIgnore then do unless (n == 0) $ do let limit x = let (a,b) = splitAt 200 x in a ++ (if null b then "" else "...") let disp = map (\x -> if isPrint x && isAscii x then x else '?') . 
takeWhile (`notElem` "\r\n") outputErr $ unlines ["Error when reading Shake database - invalid version stamp detected:" ," File: " ++ dbfile ," Expected: " ++ disp (LBS.unpack ver) ," Found: " ++ disp (limit $ LBS.unpack oldVer) ,"All rules will be rebuilt"] continue h Map.empty else -- make sure you are not handling exceptions from inside join $ handleJust (\e -> if asyncException e then Nothing else Just e) (\err -> do msg <- showException err outputErr $ unlines $ ("Error when reading Shake database " ++ dbfile) : map (" "++) (lines msg) ++ ["All files will be rebuilt"] when shakeStorageLog $ do hSeek h AbsoluteSeek 0 i <- hFileSize h bs <- LBS.hGet h $ fromInteger i let cor = shakeFiles ".shake.corrupt" LBS.writeFile cor bs unexpected $ "Backup of corrupted file stored at " ++ cor ++ ", " ++ show i ++ " bytes\n" -- exitFailure -- should never happen without external corruption -- add back to check during random testing return $ continue h Map.empty) $ case readChunks src of ([], slop) -> do when (LBS.length slop > 0) $ unexpected $ "Last " ++ show slop ++ " bytes do not form a whole record\n" diagnostic $ "Read 0 chunks, plus " ++ show slop ++ " slop" return $ continue h Map.empty (w:xs, slopRaw) -> do let slop = fromIntegral $ LBS.length slopRaw when (slop > 0) $ unexpected $ "Last " ++ show slop ++ " bytes do not form a whole record\n" diagnostic $ "Read " ++ show (length xs + 1) ++ " chunks, plus " ++ show slop ++ " slop" let ws = decode w f mp (k, v) = Map.insert k v mp ents = map (runGet $ getWith ws) xs mp = foldl' f Map.empty ents when (shakeVerbosity == Diagnostic) $ do let raw x = "[len " ++ show (LBS.length x) ++ "] " ++ concat [['0' | length c == 1] ++ c | x <- LBS8.unpack x, let c = showHex x ""] let pretty (Left x) = "FAILURE: " ++ show x pretty (Right x) = x diagnostic $ "Witnesses " ++ raw w forM_ (zip3 [1..] 
xs ents) $ \(i,x,ent) -> do x2 <- try_ $ evaluate $ let s = show ent in rnf s `seq` s diagnostic $ "Chunk " ++ show i ++ " " ++ raw x ++ " " ++ pretty x2 diagnostic $ "Slop " ++ raw slopRaw diagnostic $ "Found " ++ show (Map.size mp) ++ " real entries" -- if mp is null, continue will reset it, so no need to clean up if verEqual && (Map.null mp || (ws == witness && Map.size mp * 2 > length xs - 2)) then do -- make sure we reset to before the slop when (not (Map.null mp) && slop /= 0) $ do diagnostic $ "Dropping last " ++ show slop ++ " bytes of database (incomplete)" now <- hFileSize h hSetFileSize h $ now - slop hSeek h AbsoluteSeek $ now - slop hFlush h diagnostic $ "Drop complete" return $ continue h mp else do addTiming "Database compression" unexpected "Compressing database\n" diagnostic "Compressing database" hClose h -- two hClose are fine return $ do renameFile dbfile bupfile withBinaryFile dbfile ReadWriteMode $ \h -> do reset h mp removeFile bupfile diagnostic "Compression complete" continue h mp where unexpected x = when shakeStorageLog $ do t <- getCurrentTime appendFile (shakeFiles ".shake.storage.log") $ "\n[" ++ show t ++ "]: " ++ x outputErr x = do when (shakeVerbosity >= Quiet) $ shakeOutput Quiet x unexpected x ver = LBS.pack $ databaseVersion shakeVersion writeChunk h s = do diagnostic $ "Writing chunk " ++ show (LBS.length s) LBS.hPut h $ toChunk s reset h mp = do diagnostic $ "Resetting database to " ++ show (Map.size mp) ++ " elements" hSetFileSize h 0 hSeek h AbsoluteSeek 0 LBS.hPut h ver writeChunk h $ encode witness mapM_ (writeChunk h . runPut . 
putWith witness) $ Map.toList mp hFlush h diagnostic "Flush" -- continuation (since if we do a compress, h changes) continue h mp = do when (Map.null mp) $ reset h mp -- might as well, no data to lose, and need to ensure a good witness table -- also lets us recover in the case of corruption flushThread shakeFlush h $ \out -> do addTiming "With database" act mp $ \k v -> out $ toChunk $ runPut $ putWith witness (k, v) -- We avoid calling flush too often on SSD drives, as that can be slow -- Make sure all exceptions happen on the caller, so we don't have to move exceptions back -- Make sure we only write on one thread, otherwise async exceptions can cause partial writes flushThread :: Maybe Double -> Handle -> ((LBS.ByteString -> IO ()) -> IO a) -> IO a flushThread flush h act = do chan <- newChan -- operations to perform on the file kick <- newEmptyMVar -- kicked whenever something is written died <- newBarrier -- has the writing thread finished flusher <- case flush of Nothing -> return Nothing Just flush -> fmap Just $ forkIO $ forever $ do takeMVar kick threadDelay $ ceiling $ flush * 1000000 tryTakeMVar kick writeChan chan $ hFlush h >> return True root <- myThreadId writer <- forkIO $ handle_ (\e -> signalBarrier died () >> throwTo root e) $ -- only one thread ever writes, ensuring only the final write can be torn whileM $ join $ readChan chan (act $ \s -> do evaluate $ LBS.length s -- ensure exceptions occur on this thread writeChan chan $ LBS.hPut h s >> tryPutMVar kick () >> return True) `finally` do maybe (return ()) killThread flusher writeChan chan $ signalBarrier died () >> return False waitBarrier died -- Return the amount of junk at the end, along with all the chunks readChunks :: LBS.ByteString -> ([LBS.ByteString], LBS.ByteString) readChunks x | Just (n, x) <- grab 4 x , Just (y, x) <- grab (fromIntegral (decode n :: Word32)) x = first (y :) $ readChunks x | otherwise = ([], x) where grab i x | LBS.length a == i = Just (a, b) | otherwise = Nothing 
            where (a,b) = LBS.splitAt i x


toChunk :: LBS.ByteString -> LBS.ByteString
toChunk x = n `LBS.append` x
    where n = encode (fromIntegral $ LBS.length x :: Word32)


-- | Is the exception asynchronous, not a "coding error" that should be ignored
asyncException :: SomeException -> Bool
asyncException e
    | Just (_ :: AsyncException) <- fromException e = True
    | Just (_ :: ExitCode) <- fromException e = True
    | otherwise = False

shake-0.15.5/src/Development/Shake/Special.hs

-- | This module contains rule types that have special behaviour in some way.
--   Everything in this module is a hack.
module Development.Shake.Special(
    specialAlwaysRebuilds,
    specialIsFileKey
    ) where

import Development.Shake.Value
import Data.Typeable


specialAlwaysRebuilds :: Value -> Bool
specialAlwaysRebuilds v = con `elem` ["AlwaysRerunA","OracleA"] || (con == "FileA" && show v == "File {mod=NEQ,size=NEQ,digest=NEQ}")
    where con = show $ fst $ splitTyConApp $ typeValue v

specialIsFileKey :: TypeRep -> Bool
specialIsFileKey t = con == "FileQ"
    where con = show $ fst $ splitTyConApp t

shake-0.15.5/src/Development/Shake/Shake.hs

-- | The main entry point that calls all the default rules
module Development.Shake.Shake(shake) where

import Development.Shake.Types
import General.Timing
import Development.Shake.Core
import Development.Shake.Rules.Directory
import Development.Shake.Rules.File
import Development.Shake.Rules.Rerun


-- | Main entry point for running Shake build systems. For an example see the top of the module "Development.Shake".
--   Use 'ShakeOptions' to specify how the system runs, and 'Rules' to specify what to build. The function will throw
--   an exception if the build fails.
--
--   To use command line flags to modify 'ShakeOptions' see 'Development.Shake.shakeArgs'.
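The storage layer frames each record as a 4-byte `Word32` length prefix followed by the payload (`toChunk`), and `readChunks` peels complete frames off until a short read, returning the trailing partial data as slop. A self-contained sketch of that round trip, mirroring the definitions above (the demo payloads are invented):

```haskell
-- Sketch of the length-prefixed chunk framing used by the storage layer.
-- Standalone re-implementation for illustration; names mirror the source.
import Data.Binary (encode, decode)
import Data.Word (Word32)
import qualified Data.ByteString.Lazy.Char8 as LBS

toChunk :: LBS.ByteString -> LBS.ByteString
toChunk x = encode (fromIntegral $ LBS.length x :: Word32) `LBS.append` x

-- Returns the complete chunks, plus any trailing partial data ("slop")
readChunks :: LBS.ByteString -> ([LBS.ByteString], LBS.ByteString)
readChunks x
    | Just (n, x) <- grab 4 x
    , Just (y, x) <- grab (fromIntegral (decode n :: Word32)) x
        = let (ys, slop) = readChunks x in (y:ys, slop)
    | otherwise = ([], x)
    where grab i x | LBS.length a == i = Just (a, b)
                   | otherwise = Nothing
              where (a,b) = LBS.splitAt i x

main :: IO ()
main = do
    let good = LBS.concat [toChunk (LBS.pack "key"), toChunk (LBS.pack "value")]
    print $ readChunks good                              -- two chunks recovered, no slop
    print $ readChunks (good `LBS.append` LBS.pack "p")  -- "p" comes back as slop
```

The slop is exactly what the loading code above measures with `hSetFileSize`/`hSeek` to truncate an incomplete trailing write.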
shake :: ShakeOptions -> Rules () -> IO ()
shake opts r = do
    addTiming "Function shake"
    run opts $ do
        r
        defaultRuleFile
        defaultRuleDirectory
        defaultRuleRerun
        return ()

shake-0.15.5/src/Development/Shake/Rule.hs

{-# LANGUAGE CPP #-}

-- | This module is used for defining new types of rules for Shake build systems.
--   Most users will find the built-in set of rules sufficient.
module Development.Shake.Rule(
#if __GLASGOW_HASKELL__ >= 704
    ShakeValue,
#endif
    Rule(..), EqualCost(..), rule, apply, apply1,
    trackUse, trackChange, trackAllow,
    -- * Deprecated
    defaultRule
    ) where

import Development.Shake.Core
import Development.Shake.Types

{-# DEPRECATED defaultRule "Use 'rule' with 'priority' 0" #-}
-- | A deprecated way of defining a low priority rule. Defined as:
--
-- @
-- defaultRule = 'priority' 0 . 'rule'
-- @
defaultRule :: Rule key value => (key -> Maybe (Action value)) -> Rules ()
defaultRule = priority 0 . rule

shake-0.15.5/src/Development/Shake/Resource.hs

{-# LANGUAGE RecordWildCards, ViewPatterns #-}

module Development.Shake.Resource(
    Resource, newResourceIO, newThrottleIO, acquireResource, releaseResource
    ) where

import Data.Function
import System.IO.Unsafe
import Control.Concurrent.Extra
import Data.Tuple.Extra
import Control.Monad
import General.Bilist
import Development.Shake.Pool
import System.Time.Extra
import Data.Monoid
import Prelude


{-# NOINLINE resourceIds #-}
resourceIds :: Var Int
resourceIds = unsafePerformIO $ newVar 0

resourceId :: IO Int
resourceId = modifyVar resourceIds $ \i -> let j = i + 1 in j `seq` return (j, j)


-- | A type representing an external resource which the build system should respect. There
--   are two ways to create 'Resource's in Shake:
--
-- * 'Development.Shake.newResource' creates a finite resource, stopping too many actions running
--   simultaneously.
--
-- * 'Development.Shake.newThrottle' creates a throttled resource, stopping too many actions running
--   over a short time period.
--
--   These resources are used with 'Development.Shake.withResource' when defining rules. Typically only
--   system commands (such as 'Development.Shake.cmd') should be run inside 'Development.Shake.withResource',
--   not commands such as 'Development.Shake.need'.
--
--   Be careful that the actions run within 'Development.Shake.withResource' do not themselves require further
--   resources, or you may get a \"thread blocked indefinitely in an MVar operation\" exception.
--   If an action requires multiple resources, use 'Development.Shake.withResources' to avoid deadlock.
data Resource = Resource
    {resourceOrd :: Int
        -- ^ Key used for Eq/Ord operations. To make withResources work, we require newResourceIO < newThrottleIO
    ,resourceShow :: String
        -- ^ String used for Show
    ,acquireResource :: Pool -> Int -> IO () -> IO ()
        -- ^ Acquire the resource and call the function.
    ,releaseResource :: Pool -> Int -> IO ()
        -- ^ You should only ever releaseResource that you obtained with acquireResource.
    }

instance Show Resource where show = resourceShow
instance Eq Resource where (==) = (==) `on` resourceOrd
instance Ord Resource where compare = compare `on` resourceOrd


---------------------------------------------------------------------
-- FINITE RESOURCES

data Finite = Finite
    {finiteAvailable :: !Int
        -- ^ number of currently available resources
    ,finiteWaiting :: Bilist (Int, IO ())
        -- ^ queue of people with how much they want and the action when it is allocated to them
    }

-- | A version of 'Development.Shake.newResource' that runs in IO, and can be called before calling 'Development.Shake.shake'.
--   Most people should use 'Development.Shake.newResource' instead.
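As the haddock above describes, a finite resource caps how many actions hold it at once. A minimal sketch of the user-facing pattern via shake's `newResource`/`withResource` (the rule pattern, the `ld` command, and the limit of 2 are invented for illustration):

```haskell
import Development.Shake
import Development.Shake.FilePath

-- Hypothetical rules limiting concurrent linker invocations to 2,
-- even when shake itself runs with many more worker threads.
rules :: Rules ()
rules = do
    linker <- newResource "linker" 2
    "*.exe" %> \out -> do
        -- per the docs above, 'need' stays outside withResource;
        -- only the system command is guarded
        need [out -<.> "o"]
        withResource linker 1 $ cmd "ld" "-o" out (out -<.> "o")
```

If a third rule asks for the linker while two hold it, its continuation is queued in `finiteWaiting` and resumed by `releaseResource`.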
newResourceIO :: String -> Int -> IO Resource newResourceIO name mx = do when (mx < 0) $ error $ "You cannot create a resource named " ++ name ++ " with a negative quantity, you used " ++ show mx key <- resourceId var <- newVar $ Finite mx mempty return $ Resource (negate key) shw (acquire var) (release var) where shw = "Resource " ++ name acquire :: Var Finite -> Pool -> Int -> IO () -> IO () acquire var pool want continue | want < 0 = error $ "You cannot acquire a negative quantity of " ++ shw ++ ", requested " ++ show want | want > mx = error $ "You cannot acquire more than " ++ show mx ++ " of " ++ shw ++ ", requested " ++ show want | otherwise = join $ modifyVar var $ \x@Finite{..} -> return $ if want <= finiteAvailable then (x{finiteAvailable = finiteAvailable - want}, continue) else (x{finiteWaiting = finiteWaiting `snoc` (want, addPool pool continue)}, return ()) release :: Var Finite -> Pool -> Int -> IO () release var _ i = join $ modifyVar var $ \x -> return $ f x{finiteAvailable = finiteAvailable x + i} where f (Finite i (uncons -> Just ((wi,wa),ws))) | wi <= i = second (wa >>) $ f $ Finite (i-wi) ws | otherwise = first (add (wi,wa)) $ f $ Finite i ws f (Finite i _) = (Finite i mempty, return ()) add a s = s{finiteWaiting = a `cons` finiteWaiting s} --------------------------------------------------------------------- -- THROTTLE RESOURCES -- call a function after a certain delay waiter :: Seconds -> IO () -> IO () waiter period act = void $ forkIO $ do sleep period act -- Make sure the pool cannot run try until after you have finished with it blockPool :: Pool -> IO (IO ()) blockPool pool = do bar <- newBarrier addPool pool $ do cancel <- increasePool pool waitBarrier bar cancel return $ signalBarrier bar () data Throttle -- | Some number of resources are available = ThrottleAvailable !Int -- | Some users are blocked (non-empty), plus an action to call once we go back to Available | ThrottleWaiting (IO ()) (Bilist (Int, IO ())) -- | A version of 
'Development.Shake.newThrottle' that runs in IO, and can be called before calling 'Development.Shake.shake'. -- Most people should use 'Development.Shake.newThrottle' instead. newThrottleIO :: String -> Int -> Double -> IO Resource newThrottleIO name count period = do when (count < 0) $ error $ "You cannot create a throttle named " ++ name ++ " with a negative quantity, you used " ++ show count key <- resourceId var <- newVar $ ThrottleAvailable count return $ Resource key shw (acquire var) (release var) where shw = "Throttle " ++ name acquire :: Var Throttle -> Pool -> Int -> IO () -> IO () acquire var pool want continue | want < 0 = error $ "You cannot acquire a negative quantity of " ++ shw ++ ", requested " ++ show want | want > count = error $ "You cannot acquire more than " ++ show count ++ " of " ++ shw ++ ", requested " ++ show want | otherwise = join $ modifyVar var $ \x -> case x of ThrottleAvailable i | i >= want -> return (ThrottleAvailable $ i - want, continue) | otherwise -> do stop <- blockPool pool return (ThrottleWaiting stop $ (want - i, addPool pool continue) `cons` mempty, return ()) ThrottleWaiting stop xs -> return (ThrottleWaiting stop $ xs `snoc` (want, addPool pool continue), return ()) release :: Var Throttle -> Pool -> Int -> IO () release var pool n = waiter period $ join $ modifyVar var $ \x -> return $ case x of ThrottleAvailable i -> (ThrottleAvailable $ i+n, return ()) ThrottleWaiting stop xs -> f stop n xs where f stop i (uncons -> Just ((wi,wa),ws)) | i >= wi = second (wa >>) $ f stop (i-wi) ws | otherwise = (ThrottleWaiting stop $ (wi-i,wa) `cons` ws, return ()) f stop i _ = (ThrottleAvailable i, stop) shake-0.15.5/src/Development/Shake/Progress.hs0000644000000000000000000003702012560222036017327 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable, RecordWildCards, CPP, ForeignFunctionInterface #-} {-# LANGUAGE ViewPatterns #-} -- | Progress tracking module Development.Shake.Progress( Progress(..), progressSimple, 
progressDisplay, progressTitlebar, progressProgram, ProgressEntry(..), progressReplay, writeProgressReport -- INTERNAL USE ONLY ) where import Control.Applicative import Data.Tuple.Extra import Control.Exception.Extra import Control.Monad import System.Environment import System.Directory import System.Process import System.FilePath import Data.Char import Data.Data import Data.IORef import Data.List import Data.Maybe import Data.Version import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Lazy.Char8 as LBS import Numeric.Extra import General.Template import System.IO.Unsafe import Paths_shake import System.Time.Extra import Data.Monoid import Prelude #ifdef mingw32_HOST_OS import Foreign import Foreign.C.Types type LPCSTR = Ptr CChar foreign import stdcall "Windows.h SetConsoleTitleA" c_setConsoleTitle :: LPCSTR -> IO Bool #endif --------------------------------------------------------------------- -- PROGRESS TYPES - exposed to the user -- | Information about the current state of the build, obtained by passing a callback function -- to 'Development.Shake.shakeProgress'. Typically a program will use 'progressDisplay' to poll this value and produce -- status messages, which is implemented using this data type. data Progress = Progress {isFailure :: !(Maybe String) -- ^ Starts out 'Nothing', becomes 'Just' a target name if a rule fails. ,countSkipped :: {-# UNPACK #-} !Int -- ^ Number of rules which were required, but were already in a valid state. ,countBuilt :: {-# UNPACK #-} !Int -- ^ Number of rules which were have been built in this run. ,countUnknown :: {-# UNPACK #-} !Int -- ^ Number of rules which have been built previously, but are not yet known to be required. ,countTodo :: {-# UNPACK #-} !Int -- ^ Number of rules which are currently required (ignoring dependencies that do not change), but not built. ,timeSkipped :: {-# UNPACK #-} !Double -- ^ Time spent building 'countSkipped' rules in previous runs. 
,timeBuilt :: {-# UNPACK #-} !Double -- ^ Time spent building 'countBuilt' rules. ,timeUnknown :: {-# UNPACK #-} !Double -- ^ Time spent building 'countUnknown' rules in previous runs. ,timeTodo :: {-# UNPACK #-} !(Double,Int) -- ^ Time spent building 'countTodo' rules in previous runs, plus the number which have no known time (have never been built before). } deriving (Eq,Ord,Show,Read,Data,Typeable) instance Monoid Progress where mempty = Progress Nothing 0 0 0 0 0 0 0 (0,0) mappend a b = Progress {isFailure = isFailure a `mplus` isFailure b ,countSkipped = countSkipped a + countSkipped b ,countBuilt = countBuilt a + countBuilt b ,countUnknown = countUnknown a + countUnknown b ,countTodo = countTodo a + countTodo b ,timeSkipped = timeSkipped a + timeSkipped b ,timeBuilt = timeBuilt a + timeBuilt b ,timeUnknown = timeUnknown a + timeUnknown b ,timeTodo = let (a1,a2) = timeTodo a; (b1,b2) = timeTodo b x1 = a1 + b1; x2 = a2 + b2 in x1 `seq` x2 `seq` (x1,x2) } --------------------------------------------------------------------- -- MEALY TYPE - for writing the progress functions -- See -- | A machine that takes inputs and produces outputs newtype Mealy i a = Mealy {runMealy :: i -> (a, Mealy i a)} instance Functor (Mealy i) where fmap f (Mealy m) = Mealy $ \i -> case m i of (x, m) -> (f x, fmap f m) instance Applicative (Mealy i) where pure x = let r = Mealy (const (x, r)) in r Mealy mf <*> Mealy mx = Mealy $ \i -> case mf i of (f, mf) -> case mx i of (x, mx) -> (f x, mf <*> mx) echoMealy :: Mealy i i echoMealy = Mealy $ \i -> (i, echoMealy) scanMealy :: (a -> b -> a) -> a -> Mealy i b -> Mealy i a scanMealy f z (Mealy m) = Mealy $ \i -> case m i of (x, m) -> let z2 = f z x in (z2, scanMealy f z2 m) --------------------------------------------------------------------- -- MEALY UTILITIES oldMealy :: a -> Mealy i a -> Mealy i (a,a) oldMealy old = scanMealy (\(_,old) new -> (old,new)) (old,old) latch :: Mealy i (Bool, a) -> Mealy i a latch s = fromJust <$> scanMealy f 
Nothing s where f old (b,v) = Just $ if b then fromMaybe v old else v iff :: Mealy i Bool -> Mealy i a -> Mealy i a -> Mealy i a iff c t f = (\c t f -> if c then t else f) <$> c <*> t <*> f -- decay'd division, compute a/b, with a decay of f -- r' is the new result, r is the last result -- r' ~= a' / b' -- r' = r*b + f*(a'-a) -- ------------- -- b + f*(b'-b) -- when f == 1, r == r' -- -- both streams must only ever increase decay :: Double -> Mealy i Double -> Mealy i Double -> Mealy i Double decay f a b = scanMealy step 0 $ (,) <$> oldMealy 0 a <*> oldMealy 0 b where step r ((a,a'),(b,b')) = if isNaN r then a' / b' else ((r*b) + f*(a'-a)) / (b + f*(b'-b)) --------------------------------------------------------------------- -- MESSAGE GENERATOR formatMessage :: Double -> Double -> String formatMessage secs perc = (if isNaN secs || secs < 0 then "??s" else showMinSec $ ceiling secs) ++ " (" ++ (if isNaN perc || perc < 0 || perc > 100 then "??" else show $ floor perc) ++ "%)" showMinSec :: Int -> String showMinSec secs = (if m == 0 then "" else show m ++ "m" ++ ['0' | s < 10]) ++ show s ++ "s" where (m,s) = divMod secs 60 liftA2' :: Applicative m => m a -> m b -> (a -> b -> c) -> m c liftA2' a b f = liftA2 f a b -- | return (number of seconds, percentage, explanation) message :: Mealy (Double, Progress) (Double, Progress) -> Mealy (Double, Progress) (Double, Double, String) message input = liftA3 (,,) time perc debug where progress = snd <$> input secs = fst <$> input debug = (\donePerSec ruleTime (todoKnown,todoUnknown) -> "Progress: " ++ "((known=" ++ showDP 2 todoKnown ++ "s) + " ++ "(unknown=" ++ show todoUnknown ++ " * time=" ++ showDP 2 ruleTime ++ "s)) " ++ "(rate=" ++ showDP 2 donePerSec ++ "))") <$> donePerSec <*> ruleTime <*> (timeTodo <$> progress) -- Number of seconds work completed in this build run -- Ignores timeSkipped which would be more truthful, but it makes the % drop sharply -- which isn't what users want done = timeBuilt <$> progress -- Work 
done per second, don't divide by 0 and don't update if 'done' doesn't change donePerSec = iff ((==) 0 <$> done) (pure 1) perSecStable where perSecStable = latch $ liftA2 (,) (uncurry (==) <$> oldMealy 0 done) perSecRaw perSecRaw = decay 1.2 done secs -- Predicted build time for a rule that has never been built before -- The high decay means if a build goes in "phases" - lots of source files, then lots of compiling -- we reach a reasonable number fairly quickly, without bouncing too much ruleTime = liftA2 weightedAverage (f (decay 10) timeBuilt countBuilt) (f (liftA2 (/)) (fst . timeTodo) (\Progress{..} -> countTodo - snd timeTodo)) -- don't call decay on todo, since it goes up and down (as things get done) where weightedAverage (w1,x1) (w2,x2) | w1 == 0 && w2 == 0 = 0 | otherwise = ((w1 *. x1) + (w2 *. x2)) / intToDouble (w1+w2) where i *. d = if i == 0 then 0 else intToDouble i * d -- since d might be NaN f divide time count = let xs = count <$> progress in liftA2 (,) xs $ divide (time <$> progress) (intToDouble <$> xs) -- Number of seconds work remaining, ignoring multiple threads todo = f <$> progress <*> ruleTime where f Progress{..} ruleTime = fst timeTodo + (fromIntegral (snd timeTodo) * ruleTime) -- Display information time = liftA2 (/) todo donePerSec perc = iff ((==) 0 <$> done) (pure 0) $ liftA2' done todo $ \done todo -> 100 * done / (done + todo) --------------------------------------------------------------------- -- EXPOSED FUNCTIONS -- | Given a sampling interval (in seconds) and a way to display the status message, -- produce a function suitable for using as 'Development.Shake.shakeProgress'. -- This function polls the progress information every /n/ seconds, produces a status -- message and displays it using the display function. -- -- Typical status messages will take the form of @1m25s (15%)@, indicating that the build -- is predicted to complete in 1 minute 25 seconds (85 seconds total), and 15% of the necessary build time has elapsed. 
-- This function uses past observations to predict future behaviour, and as such, is only -- guessing. The time is likely to go up as well as down, and will be less accurate from a -- clean build (as the system has fewer past observations). -- -- The current implementation is to predict the time remaining (based on 'timeTodo') and the -- work already done ('timeBuilt'). The percentage is then calculated as @remaining / (done + remaining)@, -- while time left is calculated by scaling @remaining@ by the observed work rate in this build, -- roughly @done / time_elapsed@. progressDisplay :: Double -> (String -> IO ()) -> IO Progress -> IO () progressDisplay sample disp prog = do disp "Starting..." -- no useful info at this stage time <- offsetTime catchJust (\x -> if x == ThreadKilled then Just () else Nothing) (loop time $ message echoMealy) (const $ disp "Finished") where loop :: IO Double -> Mealy (Double, Progress) (Double, Double, String) -> IO () loop time mealy = do sleep sample p <- prog t <- time ((secs,perc,debug), mealy) <- return $ runMealy mealy (t, p) -- putStrLn debug disp $ formatMessage secs perc ++ maybe "" (\err -> ", Failure! 
" ++ err) (isFailure p) loop time mealy data ProgressEntry = ProgressEntry {idealSecs :: Double, idealPerc :: Double ,actualSecs :: Double, actualPerc :: Double } isInvalid :: ProgressEntry -> Bool isInvalid ProgressEntry{..} = isNaN actualSecs || isNaN actualPerc -- | Given a list of progress inputs, what would you have suggested (seconds, percentage) progressReplay :: [(Double, Progress)] -> [ProgressEntry] progressReplay [] = [] progressReplay ps = snd $ mapAccumL f (message echoMealy) ps where end = fst $ last ps f a (time,p) = (a2, ProgressEntry (end - time) (time * 100 / end) secs perc) where ((secs,perc,_),a2) = runMealy a (time,p) -- | Given a trace, display information about how well we did writeProgressReport :: FilePath -> [(FilePath, [(Double, Progress)])] -> IO () writeProgressReport out (map (second progressReplay) -> xs) | (bad,_):_ <- filter (any isInvalid . snd) xs = errorIO $ "Progress generates NaN for " ++ bad | takeExtension out == ".js" = writeFile out $ "var shake = \n" ++ generateJSON xs | takeExtension out == ".json" = writeFile out $ generateJSON xs | out == "-" = putStr $ unlines $ generateSummary xs | otherwise = LBS.writeFile out =<< generateHTML xs generateSummary :: [(FilePath, [ProgressEntry])] -> [String] generateSummary xs = flip concatMap xs $ \(file,xs) -> ["# " ++ file, f xs "Seconds" idealSecs actualSecs, f xs "Percent" idealPerc actualPerc] where levels = [100,90,80,50] f xs lbl ideal actual = lbl ++ ": " ++ intercalate ", " [show l ++ "% within " ++ show (ceiling $ maximum $ 0 : take ((length xs * l) `div` 100) diff) | l <- levels] where diff = sort [abs $ ideal x - actual x | x <- xs] generateHTML :: [(FilePath, [ProgressEntry])] -> IO LBS.ByteString generateHTML xs = do htmlDir <- getDataFileName "html" report <- LBS.readFile $ htmlDir "progress.html" let f name | name == "progress-data.js" = return $ LBS.pack $ "var shake =\n" ++ generateJSON xs | name == "version.js" = return $ LBS.pack $ "var version = " ++ show 
(showVersion version) | otherwise = LBS.readFile $ htmlDir name runTemplate f report generateJSON :: [(FilePath, [ProgressEntry])] -> String generateJSON = concat . jsonList . map ((++"}") . unlines . f) where f (file,ps) = ("{\"name\":" ++ show (takeFileName file) ++ ", \"values\":") : indent (jsonList $ map g ps) shw = showDP 1 g ProgressEntry{..} = jsonObject [("idealSecs",shw idealSecs),("idealPerc",shw idealPerc) ,("actualSecs",shw actualSecs),("actualPerc",shw actualPerc)] indent = map (" "++) jsonList xs = zipWith (:) ('[':repeat ',') xs ++ ["]"] jsonObject xs = "{" ++ intercalate ", " [show a ++ ":" ++ b | (a,b) <- xs] ++ "}" {-# NOINLINE xterm #-} xterm :: Bool xterm = System.IO.Unsafe.unsafePerformIO $ -- Terminal.app uses "xterm-256color" as its env variable catch_ (fmap ("xterm" `isPrefixOf`) $ getEnv "TERM") $ \e -> return False -- | Set the title of the current console window to the given text. If the -- environment variable @$TERM@ is set to @xterm@ this uses xterm escape sequences. -- On Windows, if not detected as an xterm, this function uses the @SetConsoleTitle@ API. progressTitlebar :: String -> IO () progressTitlebar x | xterm = BS.putStr $ BS.pack $ "\ESC]0;" ++ x ++ "\BEL" #ifdef mingw32_HOST_OS | otherwise = BS.useAsCString (BS.pack x) $ \x -> c_setConsoleTitle x >> return () #else | otherwise = return () #endif -- | Call the program @shake-progress@ if it is on the @$PATH@. The program is called with -- the following arguments: -- -- * @--title=string@ - the string passed to @progressProgram@. -- -- * @--state=Normal@, or one of @NoProgress@, @Normal@, or @Error@ to indicate -- what state the progress bar should be in. -- -- * @--value=25@ - the percent of the build that has completed, if not in @NoProgress@ state. -- -- The program will not be called consecutively with the same @--state@ and @--value@ options. -- -- Windows 7 or higher users can get taskbar progress notifications by placing the following -- program in their @$PATH@: . 
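The `--title`/`--state`/`--value` contract above is all a `shake-progress` program needs to honour. A hypothetical skeleton of such a receiver (the flag names come from the docs above; the rendering logic is invented):

```haskell
-- Hypothetical skeleton of a shake-progress program, showing the
-- command-line contract: --title=..., --state=..., --value=...
import System.Environment (getArgs)
import Data.List (stripPrefix)
import Data.Maybe (mapMaybe, listToMaybe)

-- Extract the value of a "--name=value" flag, if present
flagValue :: String -> [String] -> Maybe String
flagValue name = listToMaybe . mapMaybe (stripPrefix ("--" ++ name ++ "="))

main :: IO ()
main = do
    args <- getArgs
    case flagValue "state" args of
        Just "Error" -> putStrLn "build failing"
        _ -> putStrLn $ maybe "no progress" (++ "% complete") (flagValue "value" args)
```

A real implementation would update a taskbar or notification instead of printing, but the argument parsing is the whole protocol.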
progressProgram :: IO (String -> IO ())
progressProgram = do
    exe <- findExecutable "shake-progress"
    case exe of
        Nothing -> return $ const $ return ()
        Just exe -> do
            ref <- newIORef Nothing
            return $ \msg -> do
                let failure = " Failure! " `isInfixOf` msg
                let perc = let (a,b) = break (== '%') msg
                           in if null b then "" else reverse $ takeWhile isDigit $ reverse a
                let key = (failure, perc)
                same <- atomicModifyIORef ref $ \old -> (Just key, old == Just key)
                let state = if perc == "" then "NoProgress" else if failure then "Error" else "Normal"
                rawSystem exe $ ["--title=" ++ msg, "--state=" ++ state] ++ ["--value=" ++ perc | perc /= ""]
                return ()


-- | A simple method for displaying progress messages, suitable for using as 'Development.Shake.shakeProgress'.
--   This function writes the current progress to the titlebar every five seconds using 'progressTitlebar',
--   and calls any @shake-progress@ program on the @$PATH@ using 'progressProgram'.
progressSimple :: IO Progress -> IO ()
progressSimple p = do
    program <- progressProgram
    progressDisplay 5 (\s -> progressTitlebar s >> program s) p

shake-0.15.5/src/Development/Shake/Profile.hs

{-# LANGUAGE PatternGuards, RecordWildCards #-}

module Development.Shake.Profile(ProfileEntry(..), ProfileTrace(..), writeProfile) where

import General.Template
import Data.Tuple.Extra
import Data.Function
import Data.List
import Data.Version
import System.FilePath
import Numeric.Extra
import Paths_shake
import System.Time.Extra
import qualified Data.ByteString.Lazy.Char8 as LBS

data ProfileEntry = ProfileEntry
    {prfName :: String, prfBuilt :: Int, prfChanged :: Int, prfDepends :: [Int], prfExecution :: Double, prfTraces :: [ProfileTrace]}
data ProfileTrace = ProfileTrace
    {prfCommand :: String, prfStart :: Double, prfStop :: Double}
prfTime ProfileTrace{..} = prfStop - prfStart

-- | Generates a report given some build system profiling data.
writeProfile :: FilePath -> [ProfileEntry] -> IO () writeProfile out xs | takeExtension out == ".js" = writeFile out $ "var shake = \n" ++ generateJSON xs | takeExtension out == ".json" = writeFile out $ generateJSON xs | takeExtension out == ".trace" = writeFile out $ generateTrace xs | out == "-" = putStr $ unlines $ generateSummary xs | otherwise = LBS.writeFile out =<< generateHTML xs generateSummary :: [ProfileEntry] -> [String] generateSummary xs = ["* This database has tracked " ++ show (maximum (0 : map prfChanged xs) + 1) ++ " runs." ,let f = show . length in "* There are " ++ f xs ++ " rules (" ++ f ls ++ " rebuilt in the last run)." ,let f = show . sum . map (length . prfTraces) in "* Building required " ++ f xs ++ " traced commands (" ++ f ls ++ " in the last run)." ,"* The total (unparallelised) time is " ++ showDuration (sum $ map prfExecution xs) ++ " of which " ++ showDuration (sum $ map prfTime $ concatMap prfTraces xs) ++ " is traced commands." ,let f xs = if null xs then "0s" else (\(a,b) -> showDuration a ++ " (" ++ b ++ ")") $ maximumBy (compare `on` fst) xs in "* The longest rule takes " ++ f (map (prfExecution &&& prfName) xs) ++ ", and the longest traced command takes " ++ f (map (prfTime &&& prfCommand) $ concatMap prfTraces xs) ++ "." ,let sumLast = sum $ map prfTime $ concatMap prfTraces ls maxStop = maximum $ 0 : map prfStop (concatMap prfTraces ls) in "* Last run gave an average parallelism of " ++ showDP 2 (if maxStop == 0 then 0 else sumLast / maxStop) ++ " times over " ++ showDuration(maxStop) ++ "." ] where ls = filter ((==) 0 . 
prfBuilt) xs generateHTML :: [ProfileEntry] -> IO LBS.ByteString generateHTML xs = do htmlDir <- getDataFileName "html" report <- LBS.readFile $ htmlDir "profile.html" let f name | name == "profile-data.js" = return $ LBS.pack $ "var shake =\n" ++ generateJSON xs | name == "version.js" = return $ LBS.pack $ "var version = " ++ show (showVersion version) | otherwise = LBS.readFile $ htmlDir name runTemplate f report generateTrace :: [ProfileEntry] -> String generateTrace xs = jsonListLines $ showEntries 0 [y{prfCommand=prfName x} | x <- xs, y <- prfTraces x] ++ showEntries 1 (concatMap prfTraces xs) where showEntries pid xs = map (showEntry pid) $ snd $ mapAccumL alloc [] $ sortBy (compare `on` prfStart) xs alloc as r | (a1,an:a2) <- break (\a -> prfStop a <= prfStart r) as = (a1++r:a2, (length a1,r)) | otherwise = (as++[r], (length as,r)) showEntry pid (tid, ProfileTrace{..}) = jsonObject [("args","{}"), ("ph",show "X"), ("cat",show "target") ,("name",show prfCommand), ("tid",show tid), ("pid",show pid) ,("ts",show $ 1000000*prfStart), ("dur",show $ 1000000*(prfStop-prfStart))] generateJSON :: [ProfileEntry] -> String generateJSON = jsonListLines . map showEntry where showEntry ProfileEntry{..} = jsonObject $ [("name", show prfName) ,("built", show prfBuilt) ,("changed", show prfChanged) ,("depends", show prfDepends) ,("execution", show prfExecution)] ++ [("traces", jsonList $ map showTrace prfTraces) | not $ null prfTraces] showTrace ProfileTrace{..} = jsonObject [("command",show prfCommand), ("start",show prfStart), ("stop",show prfStop)] jsonListLines xs = "[" ++ intercalate "\n," xs ++ "\n]" jsonList xs = "[" ++ intercalate "," xs ++ "]" jsonObject xs = "{" ++ intercalate ", " [show a ++ ":" ++ b | (a,b) <- xs] ++ "}" shake-0.15.5/src/Development/Shake/Pool.hs0000644000000000000000000001454412560222036016442 0ustar0000000000000000 -- | Thread pool implementation. 
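The thread pool below stores pending work in an unfair queue: a binary tree navigated by random bits, so dequeue order is effectively random. A deterministic sketch of the same tree operations, checking that every enqueued element comes back out exactly once (a fixed bit stream replaces the `randomIO` source used by the real code):

```haskell
-- Standalone sketch of the random-tree queue from the pool below,
-- driven by a deterministic bit stream instead of randomIO.
import Data.List (sort)

data Tree a = Leaf a | Branch (Tree a) (Tree a)

insertTree :: [Bool] -> a -> Tree a -> Tree a
insertTree _ x (Leaf y) = Branch (Leaf x) (Leaf y)
insertTree (b:bs) x (Branch y z) = if b then f y z else f z y
    where f y z = Branch y (insertTree bs x z)

removeTree :: [Bool] -> Tree a -> (a, Maybe (Tree a))
removeTree _ (Leaf x) = (x, Nothing)
removeTree (b:bs) (Branch y z) = if b then f y z else f z y
    where f y z = case removeTree bs z of
                      (x, Nothing) -> (x, Just y)
                      (x, Just z) -> (x, Just $ Branch y z)

bits :: [Bool]
bits = cycle [True, False]  -- deterministic stand-in for the random bits

drain :: Maybe (Tree a) -> [a]
drain Nothing = []
drain (Just t) = let (x, t') = removeTree bits t in x : drain t'

main :: IO ()
main = do
    let t = foldr (insertTree bits) (Leaf 1) [2..5 :: Int]
    print $ sort (drain (Just t)) == [1..5]  -- True: nothing lost or duplicated
```

Because each `Branch` flips a coin on every insert and remove, no queued task can be starved indefinitely, which is the property the pool relies on.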
module Development.Shake.Pool( Pool, runPool, addPool, addPoolPriority, increasePool ) where import Control.Concurrent.Extra import Control.Exception import Control.Monad import General.Timing import qualified Data.HashSet as Set import System.IO.Unsafe import System.Random --------------------------------------------------------------------- -- UNFAIR/RANDOM QUEUE -- Monad for non-deterministic (but otherwise pure) computations type NonDet a = IO a nonDet :: NonDet [Bool] nonDet = do bs <- unsafeInterleaveIO nonDet b <- randomIO return $ b:bs -- Left = deterministic list, Right = non-deterministic tree data Queue a = Queue [a] (Either [a] (Maybe (Tree a))) newQueue :: Bool -> Queue a newQueue deterministic = Queue [] $ if deterministic then Left [] else Right Nothing enqueuePriority :: a -> Queue a -> Queue a enqueuePriority x (Queue p t) = Queue (x:p) t enqueue :: a -> Queue a -> NonDet (Queue a) enqueue x (Queue p (Left xs)) = return $ Queue p $ Left $ x:xs enqueue x (Queue p (Right Nothing)) = return $ Queue p $ Right $ Just $ Leaf x enqueue x (Queue p (Right (Just t))) = do bs <- nonDet; return $ Queue p $ Right $ Just $ insertTree bs x t dequeue :: Queue a -> Maybe (NonDet (a, Queue a)) dequeue (Queue (p:ps) t) = Just $ return (p, Queue ps t) dequeue (Queue [] (Left (x:xs))) = Just $ return (x, Queue [] $ Left xs) dequeue (Queue [] (Left [])) = Nothing dequeue (Queue [] (Right (Just t))) = Just $ do bs <- nonDet; (x,t) <- return $ removeTree bs t; return (x, Queue [] $ Right t) dequeue (Queue [] (Right Nothing)) = Nothing --------------------------------------------------------------------- -- TREE -- Note that for a Random tree, since everything is Random, Branch x y =~= Branch y x data Tree a = Leaf a | Branch (Tree a) (Tree a) insertTree :: [Bool] -> a -> Tree a -> Tree a insertTree _ x (Leaf y) = Branch (Leaf x) (Leaf y) insertTree (b:bs) x (Branch y z) = if b then f y z else f z y where f y z = Branch y (insertTree bs x z) removeTree :: [Bool] -> Tree a 
-> (a, Maybe (Tree a)) removeTree _ (Leaf x) = (x, Nothing) removeTree (b:bs) (Branch y z) = if b then f y z else f z y where f y z = case removeTree bs z of (x, Nothing) -> (x, Just y) (x, Just z) -> (x, Just $ Branch y z) --------------------------------------------------------------------- -- THREAD POOL {- Must keep a list of active threads, so can raise exceptions in a timely manner If any worker throws an exception, must signal to all the other workers -} data Pool = Pool !(Var (Maybe S)) -- Current state, 'Nothing' to say we are aborting !(Barrier (Either SomeException S)) -- Barrier to signal that we are data S = S {threads :: !(Set.HashSet ThreadId) -- IMPORTANT: Must be strict or we leak thread stacks ,threadsLimit :: {-# UNPACK #-} !Int -- user supplied thread limit, Set.size threads <= threadsLimit ,threadsMax :: {-# UNPACK #-} !Int -- high water mark of Set.size threads (accounting only) ,threadsSum :: {-# UNPACK #-} !Int -- number of threads we have been through (accounting only) ,todo :: !(Queue (IO ())) -- operations waiting a thread } emptyS :: Int -> Bool -> S emptyS n deterministic = S Set.empty n 0 0 $ newQueue deterministic -- | Given a pool, and a function that breaks the S invariants, restore them -- They are only allowed to touch threadsLimit or todo step :: Pool -> (S -> NonDet S) -> IO () step pool@(Pool var done) op = do let onVar act = modifyVar_ var $ maybe (return Nothing) act onVar $ \s -> do s <- op s res <- maybe (return Nothing) (fmap Just) $ dequeue $ todo s case res of Just (now, todo2) | Set.size (threads s) < threadsLimit s -> do -- spawn a new worker t <- forkIO $ do t <- myThreadId res <- try now case res of Left e -> onVar $ \s -> do mapM_ killThread $ Set.toList $ Set.delete t $ threads s signalBarrier done $ Left e return Nothing Right _ -> step pool $ \s -> return s{threads = Set.delete t $ threads s} let threads2 = Set.insert t $ threads s return $ Just s{todo = todo2, threads = threads2 ,threadsSum = threadsSum s + 1, 
threadsMax = threadsMax s `max` Set.size threads2} Nothing | Set.null $ threads s -> do signalBarrier done $ Right s return Nothing _ -> return $ Just s -- | Add a new task to the pool, may be cancelled by sending it an exception addPool :: Pool -> IO a -> IO () addPool pool act = step pool $ \s -> do todo <- enqueue (void act) (todo s) return s{todo = todo} -- | Add a new task to the pool, may be cancelled by sending it an exception. -- Takes priority over everything else. addPoolPriority :: Pool -> IO a -> IO () addPoolPriority pool act = step pool $ \s -> do todo <- return $ enqueuePriority (void act) (todo s) return s{todo = todo} -- | Temporarily increase the pool by 1 thread. Call the cleanup action to restore the value. -- After calling cleanup you should requeue onto a new thread. increasePool :: Pool -> IO (IO ()) increasePool pool = do step pool $ \s -> return s{threadsLimit = threadsLimit s + 1} return $ step pool $ \s -> return s{threadsLimit = threadsLimit s - 1} -- | Run all the tasks in the pool on the given number of workers. -- If any thread throws an exception, the exception will be reraised.
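The scheduling logic above (spawn a worker only while below `threadsLimit`, finish once every task has run) can be illustrated with a much simpler, hypothetical bounded runner. This is a sketch, not the library's `Pool`: the names `runBounded` and friends are mine, and there are no priorities, random queues or exception propagation.

```haskell
import Control.Concurrent
import Control.Concurrent.QSem
import Control.Exception (bracket_, finally)
import Control.Monad (forM)

-- Run the given actions with at most n executing concurrently,
-- returning only once every action has finished.
runBounded :: Int -> [IO ()] -> IO ()
runBounded n acts = do
    sem <- newQSem n                     -- counts the free worker slots
    dones <- forM acts $ \act -> do
        done <- newEmptyMVar
        _ <- forkIO $ bracket_ (waitQSem sem) (signalQSem sem) act
                        `finally` putMVar done ()
        return done
    mapM_ takeMVar dones                 -- wait for every task
```

Shake's real pool additionally kills all sibling threads and re-raises the exception as soon as any worker fails, which is why it tracks the live `ThreadId` set explicitly.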
runPool :: Bool -> Int -> (Pool -> IO ()) -> IO () -- run all tasks in the pool runPool deterministic n act = do s <- newVar $ Just $ emptyS n deterministic let cleanup = modifyVar_ s $ \s -> do -- if someone kills our thread, make sure we kill our child threads case s of Just s -> mapM_ killThread $ Set.toList $ threads s Nothing -> return () return Nothing flip onException cleanup $ do res <- newBarrier let pool = Pool s res addPool pool $ act pool res <- waitBarrier res case res of Left e -> throw e Right s -> addTiming $ "Pool finished (" ++ show (threadsSum s) ++ " threads, " ++ show (threadsMax s) ++ " max)" shake-0.15.5/src/Development/Shake/Monad.hs0000644000000000000000000000674112560222036016567 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving #-} module Development.Shake.Monad( RAW, Capture, runRAW, getRO, getRW, getsRO, getsRW, putRW, modifyRW, withRO, withRW, catchRAW, tryRAW, throwRAW, unmodifyRW, captureRAW, ) where import Control.Exception.Extra import Control.Monad.IO.Class import Control.Monad.Trans.Cont import Control.Monad.Trans.Reader import Data.IORef import Control.Applicative import Prelude data S ro rw = S {handler :: IORef (SomeException -> IO ()) ,ro :: ro ,rww :: IORef rw -- Read/Write Writeable var (rww) } newtype RAW ro rw a = RAW {fromRAW :: ReaderT (S ro rw) (ContT () IO) a} deriving (Functor, Applicative, Monad, MonadIO) type Capture a = (a -> IO ()) -> IO () -- | Run and then call a continuation. runRAW :: ro -> rw -> RAW ro rw a -> Capture (Either SomeException a) runRAW ro rw m k = do rww <- newIORef rw handler <- newIORef $ k . Left fromRAW m `runReaderT` S handler ro rww `runContT` (k . Right) `catch_` \e -> ($ e) =<< readIORef handler --------------------------------------------------------------------- -- STANDARD getRO :: RAW ro rw ro getRO = RAW $ asks ro getRW :: RAW ro rw rw getRW = RAW $ liftIO . 
readIORef =<< asks rww getsRO :: (ro -> a) -> RAW ro rw a getsRO f = fmap f getRO getsRW :: (rw -> a) -> RAW ro rw a getsRW f = fmap f getRW -- | Strict version putRW :: rw -> RAW ro rw () putRW rw = rw `seq` RAW $ liftIO . flip writeIORef rw =<< asks rww withRAW :: (S ro rw -> S ro2 rw2) -> RAW ro2 rw2 a -> RAW ro rw a withRAW f m = RAW $ withReaderT f $ fromRAW m modifyRW :: (rw -> rw) -> RAW ro rw () modifyRW f = do x <- getRW; putRW $ f x withRO :: (ro -> ro2) -> RAW ro2 rw a -> RAW ro rw a withRO f = withRAW $ \s -> s{ro=f $ ro s} withRW :: (rw -> rw2) -> RAW ro rw2 a -> RAW ro rw a withRW f m = do rw <- getRW rww <- liftIO $ newIORef $ f rw withRAW (\s -> s{rww=rww}) m --------------------------------------------------------------------- -- EXCEPTIONS catchRAW :: RAW ro rw a -> (SomeException -> RAW ro rw a) -> RAW ro rw a catchRAW m hdl = RAW $ ReaderT $ \s -> ContT $ \k -> do old <- readIORef $ handler s writeIORef (handler s) $ \e -> do writeIORef (handler s) old fromRAW (hdl e) `runReaderT` s `runContT` k `catch_` \e -> ($ e) =<< readIORef (handler s) fromRAW m `runReaderT` s `runContT` \v -> do writeIORef (handler s) old k v tryRAW :: RAW ro rw a -> RAW ro rw (Either SomeException a) tryRAW m = catchRAW (fmap Right m) (return . Left) throwRAW :: Exception e => e -> RAW ro rw a throwRAW = liftIO . throwIO --------------------------------------------------------------------- -- WEIRD STUFF -- | Apply a modification, run an action, then undo the changes after. unmodifyRW :: (rw -> (rw, rw -> rw)) -> RAW ro rw a -> RAW ro rw a unmodifyRW f m = do (s2,undo) <- fmap f getRW putRW s2 res <- m modifyRW undo return res -- | Capture a continuation. The continuation should be called at most once. -- Calling the same continuation, multiple times, in parallel, results in incorrect behaviour. 
captureRAW :: Capture (Either SomeException a) -> RAW ro rw a captureRAW f = RAW $ ReaderT $ \s -> ContT $ \k -> do old <- readIORef (handler s) writeIORef (handler s) throwIO f $ \x -> case x of Left e -> old e Right v -> do writeIORef (handler s) old k v `catch_` \e -> ($ e) =<< readIORef (handler s) writeIORef (handler s) throwIO shake-0.15.5/src/Development/Shake/FilePattern.hs0000644000000000000000000001641412560222036017744 0ustar0000000000000000{-# LANGUAGE PatternGuards, ViewPatterns #-} module Development.Shake.FilePattern( FilePattern, (?==), (<//>), compatible, simple, extract, substitute, directories, directories1 ) where import System.FilePath(isPathSeparator, pathSeparators, pathSeparator) import Data.List.Extra import Data.Tuple.Extra --------------------------------------------------------------------- -- BASIC FILE PATTERN MATCHING -- | A type synonym for file patterns, containing @\/\/@ and @*@. For the syntax -- and semantics of 'FilePattern' see '?=='. -- -- Most 'normaliseEx'd 'FilePath' values are suitable as 'FilePattern' values which match -- only that specific file. On Windows @\\@ is treated as equivalent to @\/@. -- -- You can write 'FilePattern' values as a literal string, or build them -- up using the operators 'Development.Shake.FilePath.<.>', 'Development.Shake.FilePath.</>' -- and 'Development.Shake.<//>'. However, beware that: -- -- * On Windows, use 'Development.Shake.FilePath.<.>' from "Development.Shake.FilePath" instead of from -- "System.FilePath" - otherwise @\"\/\/*\" \<.\> exe@ results in @\"\/\/*\\\\.exe\"@. -- -- * If the second argument of 'Development.Shake.FilePath.</>' has a leading path separator (namely @\/@) -- then the second argument will be returned.
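The `*` and `//` semantics documented here can be illustrated with a small standalone matcher. This is a hypothetical simplification, not the library's lexer/regex implementation: it handles Posix separators only, and the names `components`, `groups`, `matchComp`, `matchGroups` and `?==?` are invented for the sketch.

```haskell
import Data.List (tails)

-- Split a path (or a pattern group) into '/'-separated components.
components :: String -> [String]
components s = case break (== '/') s of
    (a, [])     -> [a]
    (a, _:rest) -> a : components rest

-- Split a pattern on "//" into groups of component patterns.
-- An empty group (from a leading or trailing "//") matches zero components.
groups :: String -> [[String]]
groups = map comps . splitSS
  where
    splitSS s = go s ""
    go ('/':'/':rest) acc = reverse acc : go rest ""
    go (c:rest) acc = go rest (c:acc)
    go [] acc = [reverse acc]
    comps "" = []
    comps s  = components s

-- '*' matches any run of characters within one component, never across '/'.
matchComp :: String -> String -> Bool
matchComp ('*':p) x = any (matchComp p) (tails x)
matchComp (c:p) (y:ys) = c == y && matchComp p ys
matchComp [] [] = True
matchComp _ _ = False

-- Each "//" may stand in for any number of whole path components.
matchGroups :: [[String]] -> [String] -> Bool
matchGroups [g] xs = length g == length xs && and (zipWith matchComp g xs)
matchGroups (g:gs) xs =
    length xs >= length g
    && and (zipWith matchComp g xs)
    && any (matchGroups gs) (tails (drop (length g) xs))
matchGroups [] xs = null xs

-- The sketch's analogue of '?=='.
(?==?) :: String -> FilePath -> Bool
p ?==? x = matchGroups (groups p) (components x)
```

Under this sketch `"*.c" ?==? "file.c"` holds but `"*.c" ?==? "dir/file.c"` does not, matching the examples in the `?==` documentation below.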
type FilePattern = String data Lexeme = Star | SlashSlash | Char Char deriving (Show, Eq) isChar (Char _) = True; isChar _ = False isDull (Char x) = not $ isPathSeparator x; isDull _ = False fromChar (Char x) = x data Regex = Lit [Char] | Not [Char] | Any | Start | End | Bracket Regex | Or Regex Regex | Concat Regex Regex | Repeat Regex | Empty deriving Show type SString = (Bool, String) -- fst is True if at the start of the string lexer :: FilePattern -> [Lexeme] lexer ('*':xs) = Star : lexer xs lexer (s1:s2:xs) | isPathSeparator s1 && isPathSeparator s2 = SlashSlash : lexer xs lexer (x:xs) = Char x : lexer xs lexer [] = [] pattern :: [Lexeme] -> Regex pattern = Concat Start . foldr Concat End . map f where f Star = Bracket $ Repeat $ Not pathSeparators f SlashSlash = let s = Start `Or` End `Or` Lit pathSeparators in Bracket $ Or (s `Concat` Repeat Any `Concat` s) (Lit pathSeparators) f (Char x) = Lit $ if isPathSeparator x then pathSeparators else [x] -- | Return is (brackets, matched, rest) match :: Regex -> SString -> [([String], String, SString)] match (Lit l) (_, x:xs) | x `elem` l = [([], [x], (False, xs))] match (Not l) (_, x:xs) | x `notElem` l = [([], [x], (False, xs))] match Any (_, x:xs) = [([], [x], (False, xs))] match Start (True, xs) = [([], [], (True, xs))] match End (s, []) = [([], [], (s, []))] match (Bracket r) xs = [(a ++ [b], b, c) | (a,b,c) <- match r xs] match (Or r1 r2) xs = match r1 xs ++ match r2 xs match (Concat r1 r2) xs = [(a1++a2,b1++b2,c2) | (a1,b1,c1) <- match r1 xs, (a2,b2,c2) <- match r2 c1] match (Repeat r) xs = match (Empty `Or` Concat r (Repeat r)) xs match Empty xs = [([], "", xs)] match _ _ = [] -- | Match a 'FilePattern' against a 'FilePath', There are only two special forms: -- -- * @*@ matches an entire path component, excluding any separators. -- -- * @\/\/@ matches an arbitrary number of path components. -- -- Some examples: -- -- * @test.c@ matches @test.c@ and nothing else. 
-- -- * @*.c@ matches all @.c@ files in the current directory, so @file.c@ matches, -- but @file.h@ and @dir\/file.c@ don't. -- -- * @\/\/*.c@ matches all @.c@ files in the current directory or its subdirectories, -- so @file.c@, @dir\/file.c@ and @dir1\/dir2\/file.c@ all match, but @file.h@ and -- @dir\/file.h@ don't. -- -- * @dir\/*\/*@ matches all files one level below @dir@, so @dir\/one\/file.c@ and -- @dir\/two\/file.h@ match, but @file.c@, @one\/dir\/file.c@, @dir\/file.h@ -- and @dir\/one\/two\/file.c@ don't. -- -- Patterns with constructs such as @foo\/..\/bar@ will never match -- normalised 'FilePath' values, so are unlikely to be correct. (?==) :: FilePattern -> FilePath -> Bool (?==) [s1,s2,'*'] | isPathSeparator s1 && isPathSeparator s2 = const True (?==) p = \x -> not $ null $ match pat (True, x) where pat = pattern $ lexer p infixr 5 <//> -- | Join two 'FilePattern' values by inserting two @\/@ characters between them. -- Will first remove any trailing path separators on the first argument, and any leading -- separators on the second. -- -- > "dir" <//> "*" == "dir//*" (<//>) :: FilePattern -> FilePattern -> FilePattern a <//> b = dropWhileEnd isPathSeparator a ++ "//" ++ dropWhile isPathSeparator b --------------------------------------------------------------------- -- DIRECTORY PATTERNS -- | Given a pattern, return the directory that requires searching, -- with 'True' if it requires a recursive search. Must be conservative. -- Examples: -- -- > directories1 "*.xml" == ("",False) -- > directories1 "//*.xml" == ("",True) -- > directories1 "foo//*.xml" == ("foo",True) -- > directories1 "foo/bar/*.xml" == ("foo/bar",False) -- > directories1 "*/bar/*.xml" == ("",True) directories1 :: FilePattern -> (FilePath, Bool) directories1 = first (intercalate [pathSeparator]) . f .
lexer where f xs | (a@(_:_),b:bs) <- span isDull xs, b `elem` (SlashSlash:map Char pathSeparators) = if b == SlashSlash then ([map fromChar a],True) else first (map fromChar a:) $ f bs | all (\x -> isDull x || x == Star) xs = ([],False) | otherwise = ([], True) -- | Given a set of patterns, produce a set of directories that require searching, -- with 'True' if it requires a recursive search. Must be conservative. Examples: -- -- > directories ["*.xml","//*.c"] == [("",True)] -- > directories ["bar/*.xml","baz//*.c"] == [("bar",False),("baz",True)] directories :: [FilePattern] -> [(FilePath,Bool)] directories ps = foldl f xs xs where xs = nubOrd $ map directories1 ps -- Eliminate anything which is a strict subset f xs (x,True) = filter (\y -> not $ (x,False) == y || x `isPrefixSlashOf` fst y) xs f xs _ = xs isPrefixSlashOf x (stripPrefix x -> Just (s1:_)) = isPathSeparator s1 isPrefixSlashOf _ _ = False --------------------------------------------------------------------- -- MULTIPATTERN COMPATIBLE SUBSTITUTIONS -- | Is the pattern free from any * and //. simple :: FilePattern -> Bool simple = all isChar . lexer -- | Do they have the same * and // counts in the same order compatible :: [FilePattern] -> Bool compatible [] = True compatible (x:xs) = all ((==) (f x) . f) xs where f = filter (not . isChar) . lexer -- | Extract the items that match the wildcards. The pair must match with '?=='. extract :: FilePattern -> FilePath -> [String] extract p x = ms where (ms,_,_):_ = match (pattern $ lexer p) (True,x) -- | Given the result of 'extract', substitute it back in to a 'compatible' pattern.
-- -- > p '?==' x ==> substitute (extract p x) p == x substitute :: [String] -> FilePattern -> FilePath substitute ms p = f ms (lexer p) where f ms (Char p:ps) = p : f ms ps f (m:ms) (_:ps) = m ++ f ms ps f [] [] = [] f _ _ = error $ "Substitution failed into pattern " ++ show p ++ " with " ++ show (length ms) ++ " matches, namely " ++ show ms shake-0.15.5/src/Development/Shake/FilePath.hs0000644000000000000000000000763512560222036017230 0ustar0000000000000000{-# LANGUAGE CPP #-} #ifndef MIN_VERSION_filepath #if __GLASGOW_HASKELL__ >= 709 #define MIN_VERSION_filepath(a,b,c) 1 #else #define MIN_VERSION_filepath(a,b,c) 0 #endif #endif -- | A module for 'FilePath' operations exposing "System.FilePath" plus some additional operations. -- -- /Windows note:/ The extension methods ('<.>', 'takeExtension' etc) use the Posix variants since on -- Windows @\"\/\/\*\" '<.>' \"txt\"@ produces @\"\/\/\*\\\\.txt\"@ -- (which is bad for 'Development.Shake.FilePattern' values). module Development.Shake.FilePath( module System.FilePath, module System.FilePath.Posix, dropDirectory1, takeDirectory1, normaliseEx, #if !MIN_VERSION_filepath(1,4,0) (-<.>), #endif toNative, toStandard, exe ) where import System.Info.Extra import qualified System.FilePath as Native import System.FilePath hiding (splitExtension, takeExtension, replaceExtension, dropExtension, addExtension ,hasExtension, (<.>), splitExtensions, takeExtensions, dropExtensions #if MIN_VERSION_filepath(1,4,0) ,(-<.>) #endif ) import System.FilePath.Posix (splitExtension, takeExtension, replaceExtension, dropExtension, addExtension ,hasExtension, (<.>), splitExtensions, takeExtensions, dropExtensions #if MIN_VERSION_filepath(1,4,0) ,(-<.>) #endif ) #if !MIN_VERSION_filepath(1,4,0) infixr 7 -<.> -- | Remove the current extension and add another, an alias for 'replaceExtension'. (-<.>) :: FilePath -> String -> FilePath (-<.>) = replaceExtension #endif -- | Drop the first directory from a 'FilePath'. 
Should only be used on -- relative paths. -- -- > dropDirectory1 "aaa/bbb" == "bbb" -- > dropDirectory1 "aaa/" == "" -- > dropDirectory1 "aaa" == "" -- > dropDirectory1 "" == "" dropDirectory1 :: FilePath -> FilePath dropDirectory1 = drop 1 . dropWhile (not . isPathSeparator) -- | Take the first component of a 'FilePath'. Should only be used on -- relative paths. -- -- > takeDirectory1 "aaa/bbb" == "aaa" -- > takeDirectory1 "aaa/" == "aaa" -- > takeDirectory1 "aaa" == "aaa" takeDirectory1 :: FilePath -> FilePath takeDirectory1 = takeWhile (not . isPathSeparator) -- | Normalise a 'FilePath', trying to do: -- -- * All 'pathSeparators' become @\/@ -- -- * @foo\/bar\/..\/baz@ becomes @foo\/baz@ -- -- * @foo\/.\/bar@ becomes @foo\/bar@ -- -- * @foo\/\/bar@ becomes @foo\/bar@ -- -- This function is not based on the normalise function from the filepath library, as that function -- is quite broken. normaliseEx :: FilePath -> FilePath normaliseEx xs | a:b:xs <- xs, isWindows && sep a && sep b = '/' : f ('/':xs) -- account for UNC paths being double // | otherwise = f xs where sep = Native.isPathSeparator f o = toNative $ deslash o $ (++"/") $ concatMap ('/':) $ reverse $ g 0 $ reverse $ split o deslash o x | x == "/" = case (pre,pos) of (True,True) -> "/" (True,False) -> "/." (False,True) -> "./" (False,False) -> "." | otherwise = (if pre then id else tail) $ (if pos then id else init) x where pre = sep $ head $ o ++ " " pos = sep $ last $ " " ++ o g i [] = replicate i ".." g i ("..":xs) = g (i+1) xs g i (".":xs) = g i xs g 0 (x:xs) = x : g 0 xs g i (x:xs) = g (i-1) xs split xs = if null ys then [] else a : split b where (a,b) = break sep $ ys ys = dropWhile sep xs -- | Convert to native path separators, namely @\\@ on Windows. toNative :: FilePath -> FilePath toNative = if isWindows then map (\x -> if x == '/' then '\\' else x) else id -- | Convert all path separators to @/@, even on Windows. 
toStandard :: FilePath -> FilePath toStandard = if isWindows then map (\x -> if x == '\\' then '/' else x) else id -- | The extension of executables, @\"exe\"@ on Windows and @\"\"@ otherwise. exe :: String exe = if isWindows then "exe" else "" shake-0.15.5/src/Development/Shake/FileInfo.hs0000644000000000000000000001137212560222036017220 0ustar0000000000000000{-# LANGUAGE GeneralizedNewtypeDeriving, DeriveDataTypeable, CPP, ForeignFunctionInterface #-} module Development.Shake.FileInfo( FileInfo, fileInfoEq, fileInfoNeq, FileSize, ModTime, FileHash, getFileHash, getFileInfo ) where import Control.Exception import Development.Shake.Classes import General.String import qualified Data.ByteString.Lazy as LBS import Data.Char import Data.Word import Numeric import System.IO #if __GLASGOW_HASKELL__ < 704 #define PORTABLE 1 #endif #if defined(PORTABLE) import System.IO.Error import System.Directory import Data.Time import System.Time #elif defined(mingw32_HOST_OS) import qualified Data.ByteString.Char8 as BS import Foreign import Foreign.C.Types import Foreign.C.String #else import System.IO.Error import System.Posix.Files.ByteString #endif -- A piece of file information, where 0 and 1 are special (see fileInfo* functions) newtype FileInfo a = FileInfo Word32 deriving (Typeable,Hashable,Binary,NFData) fileInfoEq, fileInfoNeq :: FileInfo a fileInfoEq = FileInfo 0 -- Equal to everything fileInfoNeq = FileInfo 1 -- Equal to nothing fileInfo :: Word32 -> FileInfo a fileInfo a = FileInfo $ if a > maxBound - 2 then a else a + 2 instance Show (FileInfo a) where show (FileInfo x) | x == 0 = "EQ" | x == 1 = "NEQ" | otherwise = "0x" ++ map toUpper (showHex (x-2) "") instance Eq (FileInfo a) where FileInfo a == FileInfo b | a == 0 || b == 0 = True | a == 1 || b == 1 = False | otherwise = a == b data FileInfoHash; type FileHash = FileInfo FileInfoHash data FileInfoMod ; type ModTime = FileInfo FileInfoMod data FileInfoSize; type FileSize = FileInfo FileInfoSize getFileHash :: BSU -> 
IO FileHash getFileHash x = withFile (unpackU x) ReadMode $ \h -> do s <- LBS.hGetContents h let res = fileInfo $ fromIntegral $ hash s evaluate res return res getFileInfo :: BSU -> IO (Maybe (ModTime, FileSize)) #if defined(PORTABLE) -- Portable fallback getFileInfo x = handleJust (\e -> if isDoesNotExistError e then Just () else Nothing) (const $ return Nothing) $ do let file = unpackU x time <- getModificationTime file size <- withFile file ReadMode hFileSize return $ Just (fileInfo $ extractFileTime time, fileInfo $ fromIntegral size) -- deal with difference in return type of getModificationTime between directory versions class ExtractFileTime a where extractFileTime :: a -> Word32 instance ExtractFileTime ClockTime where extractFileTime (TOD t _) = fromIntegral t instance ExtractFileTime UTCTime where extractFileTime = floor . fromRational . toRational . utctDayTime #elif defined(mingw32_HOST_OS) -- Directly against the Win32 API, twice as fast as the portable version getFileInfo x = BS.useAsCString (unpackU_ x) $ \file -> alloca_WIN32_FILE_ATTRIBUTE_DATA $ \fad -> do res <- c_getFileAttributesExA file 0 fad let peek = do mt <- peekLastWriteTimeLow fad; sz <- peekFileSizeLow fad; return $ Just (fileInfo mt, fileInfo sz) if res then peek else if requireU x then withCWString (unpackU x) $ \file -> do res <- c_getFileAttributesExW file 0 fad if res then peek else return Nothing else return Nothing foreign import stdcall unsafe "Windows.h GetFileAttributesExA" c_getFileAttributesExA :: Ptr CChar -> Int32 -> Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Bool foreign import stdcall unsafe "Windows.h GetFileAttributesExW" c_getFileAttributesExW :: Ptr CWchar -> Int32 -> Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Bool data WIN32_FILE_ATTRIBUTE_DATA alloca_WIN32_FILE_ATTRIBUTE_DATA :: (Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO a) -> IO a alloca_WIN32_FILE_ATTRIBUTE_DATA act = allocaBytes size_WIN32_FILE_ATTRIBUTE_DATA act where size_WIN32_FILE_ATTRIBUTE_DATA = 36 peekLastWriteTimeLow :: Ptr 
WIN32_FILE_ATTRIBUTE_DATA -> IO Word32 peekLastWriteTimeLow p = peekByteOff p index_WIN32_FILE_ATTRIBUTE_DATA_ftLastWriteTime_dwLowDateTime where index_WIN32_FILE_ATTRIBUTE_DATA_ftLastWriteTime_dwLowDateTime = 20 peekFileSizeLow :: Ptr WIN32_FILE_ATTRIBUTE_DATA -> IO Word32 peekFileSizeLow p = peekByteOff p index_WIN32_FILE_ATTRIBUTE_DATA_nFileSizeLow where index_WIN32_FILE_ATTRIBUTE_DATA_nFileSizeLow = 32 #else -- Unix version getFileInfo x = handleJust (\e -> if isDoesNotExistError e then Just () else Nothing) (const $ return Nothing) $ do s <- getFileStatus $ unpackU_ x return $ Just (fileInfo $ extractFileTime s, fileInfo $ fromIntegral $ fileSize s) extractFileTime :: FileStatus -> Word32 #ifndef MIN_VERSION_unix #define MIN_VERSION_unix(a,b,c) 0 #endif #if MIN_VERSION_unix(2,6,0) extractFileTime x = ceiling $ modificationTimeHiRes x * 1e4 -- precision of 0.1ms #else extractFileTime x = fromIntegral $ fromEnum $ modificationTime x #endif #endif shake-0.15.5/src/Development/Shake/Errors.hs0000644000000000000000000001264612560222036017006 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable, PatternGuards, RecordWildCards #-} -- | Errors seen by the user module Development.Shake.Errors( ShakeException(..), errorStructured, err, errorNoRuleToBuildType, errorRuleTypeMismatch, errorIncompatibleRules, errorMultipleRulesMatch, errorRuleRecursion, errorNoApply, ) where import Data.Tuple.Extra import Control.Exception.Extra import Data.Typeable import Data.List err :: String -> a err msg = error $ "Development.Shake: Internal error, please report to Neil Mitchell (" ++ msg ++ ")" alternatives = let (*) = (,) in ["_rule_" * "oracle" ,"_Rule_" * "Oracle" ,"_key_" * "question" ,"_Key_" * "Question" ,"_result_" * "answer" ,"_Result_" * "Answer" ,"_rule/defaultRule_" * "addOracle" ,"_apply_" * "askOracle"] errorStructured :: String -> [(String, Maybe String)] -> String -> IO a errorStructured msg args hint = errorIO $ unlines $ [msg ++ ":"] ++ [" " ++ a ++ [':' | a /= ""] 
++ replicate (as - length a + 2) ' ' ++ b | (a,b) <- args2] ++ [hint | hint /= ""] where as = maximum $ 0 : map (length . fst) args2 args2 = [(a,b) | (a,Just b) <- args] structured :: Bool -> String -> [(String, Maybe String)] -> String -> IO a structured alt msg args hint = errorStructured (f msg) (map (first f) args) (f hint) where f = filter (/= '_') . (if alt then g else id) g xs | (a,b):_ <- filter (\(a,b) -> a `isPrefixOf` xs) alternatives = b ++ g (drop (length a) xs) g (x:xs) = x : g xs g [] = [] errorNoRuleToBuildType :: TypeRep -> Maybe String -> Maybe TypeRep -> IO a errorNoRuleToBuildType tk k tv = structured (specialIsOracleKey tk) "Build system error - no _rule_ matches the _key_ type" [("_Key_ type", Just $ show tk) ,("_Key_ value", k) ,("_Result_ type", fmap show tv)] "Either you are missing a call to _rule/defaultRule_, or your call to _apply_ has the wrong _key_ type" errorRuleTypeMismatch :: TypeRep -> Maybe String -> TypeRep -> TypeRep -> IO a errorRuleTypeMismatch tk k tvReal tvWant = structured (specialIsOracleKey tk) "Build system error - _rule_ used at the wrong _result_ type" [("_Key_ type", Just $ show tk) ,("_Key_ value", k) ,("_Rule_ _result_ type", Just $ show tvReal) ,("Requested _result_ type", Just $ show tvWant)] "Either the function passed to _rule/defaultRule_ has the wrong _result_ type, or the result of _apply_ is used at the wrong type" errorIncompatibleRules :: TypeRep -> TypeRep -> TypeRep -> IO a errorIncompatibleRules tk tv1 tv2 = if specialIsOracleKey tk then errorDuplicateOracle tk Nothing [tv1,tv2] else errorStructured "Build system error - rule has multiple result types" [("Key type", Just $ show tk) ,("First result type", Just $ show tv1) ,("Second result type", Just $ show tv2)] "A function passed to rule/defaultRule has the wrong result type" errorMultipleRulesMatch :: TypeRep -> String -> Int -> IO a errorMultipleRulesMatch tk k count | specialIsOracleKey tk = if count == 0 then err $ "no oracle match for " ++ show 
tk else errorDuplicateOracle tk (Just k) [] | otherwise = errorStructured ("Build system error - key matches " ++ (if count == 0 then "no" else "multiple") ++ " rules") [("Key type",Just $ show tk) ,("Key value",Just k) ,("Rules matched",Just $ show count)] (if count == 0 then "Either add a rule that produces the above key, or stop requiring the above key" else "Modify your rules/defaultRules so only one can produce the above key") errorRuleRecursion :: Maybe TypeRep -> Maybe String -> IO a errorRuleRecursion tk k = errorStructured -- may involve both rules and oracle, so report as a rule "Build system error - recursion detected" [("Key type",fmap show tk) ,("Key value",k)] "Rules may not be recursive" errorDuplicateOracle :: TypeRep -> Maybe String -> [TypeRep] -> IO a errorDuplicateOracle tk k tvs = errorStructured "Build system error - duplicate oracles for the same question type" ([("Question type",Just $ show tk) ,("Question value",k)] ++ [("Answer type " ++ show i, Just $ show tv) | (i,tv) <- zip [1..] tvs]) "Only one call to addOracle is allowed per question type" errorNoApply :: TypeRep -> Maybe String -> String -> IO a errorNoApply tk k msg = structured (specialIsOracleKey tk) "Build system error - cannot currently call _apply_" [("Reason", Just msg) ,("_Key_ type", Just $ show tk) ,("_Key_ value", k)] "Move the _apply_ call earlier/later" -- Should be in Special, but then we get an import cycle specialIsOracleKey :: TypeRep -> Bool specialIsOracleKey t = con == "OracleQ" where con = show $ fst $ splitTyConApp t -- | Error representing all expected exceptions thrown by Shake. -- Problems when executing rules will be raised using this exception type. data ShakeException = ShakeException {shakeExceptionTarget :: String -- ^ The target that was being built when the exception occurred. ,shakeExceptionStack :: [String] -- ^ The stack of targets, where the 'shakeExceptionTarget' is last.
,shakeExceptionInner :: SomeException -- ^ The underlying exception that was raised. } deriving Typeable instance Exception ShakeException instance Show ShakeException where show ShakeException{..} = unlines $ "Error when running Shake build system:" : map ("* " ++) shakeExceptionStack ++ [show shakeExceptionInner] shake-0.15.5/src/Development/Shake/Derived.hs0000644000000000000000000001550112560222036017105 0ustar0000000000000000 module Development.Shake.Derived( system', systemCwd, systemOutput, copyFile', copyFileChanged, readFile', readFileLines, writeFile', writeFileLines, writeFileChanged, withTempFile, withTempDir, getHashedShakeVersion ) where import Control.Monad.Extra import Control.Monad.IO.Class import System.Process import System.Directory import System.Exit import System.IO.Extra hiding (withTempFile, withTempDir, readFile') import Development.Shake.Core import Development.Shake.Rules.File import Development.Shake.FilePath import Development.Shake.Types import qualified Data.ByteString as BS import Data.Hashable -- | Get a checksum of a list of files, suitable for using as `shakeVersion`. -- This will trigger a rebuild when the Shake rules defined in any of the files are changed. -- For example: -- -- @ -- main = do -- ver <- 'getHashedShakeVersion' [\"Shakefile.hs\"] -- 'shakeArgs' 'shakeOptions'{'shakeVersion' = ver} ... -- @ -- -- To automatically detect the name of the current file, turn on the @TemplateHaskell@ -- extension and write @$(LitE . StringL . loc_filename \<$\> location)@. -- -- This feature can be turned off during development by passing -- the flag @--no-rule-version@ or setting 'shakeVersionIgnore' to 'True'. getHashedShakeVersion :: [FilePath] -> IO String getHashedShakeVersion files = do hashes <- mapM (fmap (hashWithSalt 0) . 
BS.readFile) files return $ "hash-" ++ show (hashWithSalt 0 hashes) checkExitCode :: String -> ExitCode -> Action () checkExitCode cmd ExitSuccess = return () checkExitCode cmd (ExitFailure i) = error $ "System command failed (code " ++ show i ++ "):\n" ++ cmd {-# DEPRECATED system' "Use 'command' or 'cmd'" #-} {-# DEPRECATED systemCwd "Use 'command' or 'cmd' with 'Cwd'" #-} {-# DEPRECATED systemOutput "Use 'command' or 'cmd' with 'Stdout' or 'Stderr'" #-} -- | /Deprecated:/ Please use 'command' or 'cmd' instead. -- This function will be removed in a future version. -- -- Execute a system command. This function will raise an error if the exit code is non-zero. -- Before running 'system'' make sure you 'need' any required files. system' :: FilePath -> [String] -> Action () system' path args = do let path2 = toNative path let cmd = unwords $ path2 : args v <- getVerbosity putLoud cmd res <- (if v >= Loud then quietly else id) $ traced (takeBaseName path) $ rawSystem path2 args checkExitCode cmd res -- | /Deprecated:/ Please use 'command' or 'cmd' instead, with 'Cwd'. -- This function will be removed in a future version. -- -- Execute a system command with a specified current working directory (first argument). -- This function will raise an error if the exit code is non-zero. -- Before running 'systemCwd' make sure you 'need' any required files. -- -- @ -- 'systemCwd' \"\/usr\/MyDirectory\" \"pwd\" [] -- @ systemCwd :: FilePath -> FilePath -> [String] -> Action () systemCwd cwd path args = do let path2 = toNative path let cmd = unwords $ path2 : args putLoud cmd res <- traced (takeBaseName path) $ do -- FIXME: Should I be using the non-exported System.Process.syncProcess? -- That installs/removes signal handlers. hdl <- runProcess path2 args (Just cwd) Nothing Nothing Nothing Nothing waitForProcess hdl checkExitCode cmd res -- | /Deprecated:/ Please use 'command' or 'cmd' instead, with 'Stdout' or 'Stderr'. -- This function will be removed in a future version. 
-- -- Execute a system command, returning @(stdout,stderr)@. -- This function will raise an error if the exit code is non-zero. -- Before running 'systemOutput' make sure you 'need' any required files. systemOutput :: FilePath -> [String] -> Action (String, String) systemOutput path args = do let path2 = toNative path let cmd = unwords $ path2 : args putLoud cmd (res,stdout,stderr) <- traced (takeBaseName path) $ readProcessWithExitCode path2 args "" checkExitCode cmd res return (stdout, stderr) -- | @copyFile' old new@ copies the existing file from @old@ to @new@. -- The @old@ file will be tracked as a dependency. copyFile' :: FilePath -> FilePath -> Action () copyFile' old new = do need [old] putLoud $ "Copying from " ++ old ++ " to " ++ new liftIO $ copyFile old new -- | @copyFile' old new@ copies the existing file from @old@ to @new@, if the contents have changed. -- The @old@ file will be tracked as a dependency. copyFileChanged :: FilePath -> FilePath -> Action () copyFileChanged old new = do need [old] eq <- liftIO $ doesFileExist new &&^ do withBinaryFile old ReadMode $ \h1 -> withBinaryFile new ReadMode $ \h2 -> liftM2 (==) (hFileSize h1) (hFileSize h2) &&^ liftM2 (==) (BS.hGetContents h1) (BS.hGetContents h2) when (not eq) $ do putLoud $ "Copying from " ++ old ++ " to " ++ new -- copyFile does a lot of clever stuff with permissions etc, so make sure we just reuse it liftIO $ copyFile old new -- | Read a file, after calling 'need'. The argument file will be tracked as a dependency. readFile' :: FilePath -> Action String readFile' x = need [x] >> liftIO (readFile x) -- | Write a file, lifted to the 'Action' monad. writeFile' :: FilePath -> String -> Action () writeFile' name x = liftIO $ writeFile name x -- | A version of 'readFile'' which also splits the result into lines. -- The argument file will be tracked as a dependency. readFileLines :: FilePath -> Action [String] readFileLines = fmap lines . 
readFile' -- | A version of 'writeFile'' which writes out a list of lines. writeFileLines :: FilePath -> [String] -> Action () writeFileLines name = writeFile' name . unlines -- | Write a file, but only if the contents would change. writeFileChanged :: FilePath -> String -> Action () writeFileChanged name x = liftIO $ do b <- doesFileExist name if not b then writeFile name x else do -- Cannot use ByteString here, since it has different line handling -- semantics on Windows b <- withFile name ReadMode $ \h -> do src <- hGetContents h return $! src /= x when b $ writeFile name x -- | Create a temporary file in the temporary directory. The file will be deleted -- after the action completes (provided the file is not still open). -- The 'FilePath' will not have any file extension, will exist, and will be zero bytes long. -- If you require a file with a specific name, use 'withTempDir'. withTempFile :: (FilePath -> Action a) -> Action a withTempFile act = do (file, del) <- liftIO newTempFile act file `actionFinally` del -- | Create a temporary directory inside the system temporary directory. -- The directory will be deleted after the action completes. withTempDir :: (FilePath -> Action a) -> Action a withTempDir act = do (dir,del) <- liftIO newTempDir act dir `actionFinally` del shake-0.15.5/src/Development/Shake/Demo.hs0000644000000000000000000001112412560222036016404 0ustar0000000000000000 -- | Demo tutorial, accessed with --demo module Development.Shake.Demo(demo) where import Paths_shake import Development.Shake.Command import Control.Exception.Extra import Control.Monad import Data.Char import Data.List import Data.Maybe import Data.Version(showVersion) import System.Directory import System.Exit import System.FilePath import Development.Shake.FilePath(exe) import System.IO import System.Info.Extra demo :: Bool -> IO () demo auto = do hSetBuffering stdout NoBuffering putStrLn $ "% Welcome to the Shake v" ++ showVersion version ++ " demo mode!" 
    putStr $ "% Detecting machine configuration... "

    -- CONFIGURE
    manual <- getDataFileName "docs/manual"
    hasManual <- wrap $ doesDirectoryExist manual
    ghc <- findExecutable "ghc"
    gcc <- do
        v <- findExecutable "gcc"
        case v of
            Nothing | isWindows, Just ghc <- ghc -> do
                let dir = takeDirectory (takeDirectory ghc) </> "bin/mingw/gcc.exe"
                b <- wrap $ doesFileExist dir
                return $ if b then Just dir else Nothing
            _ -> return v
    shakeLib <- wrap $ fmap (not . null . words . fromStdout) (cmd "ghc-pkg list --simple-output shake")
    ninja <- findExecutable "ninja"
    putStrLn "done\n"

    let path = if isWindows then "%PATH%" else "$PATH"
    require (isJust ghc) $ "% You don't have 'ghc' on your " ++ path ++ ", which is required to run the demo."
    require (isJust gcc) $ "% You don't have 'gcc' on your " ++ path ++ ", which is required to run the demo."
    require shakeLib "% You don't have the 'shake' library installed with GHC, which is required to run the demo."
    require hasManual "% You don't have the Shake data files installed, which are required to run the demo."

    empty <- fmap (null . filter (not . all (== '.'))) $ getDirectoryContents "."
    dir <- if empty then getCurrentDirectory else do
        home <- getHomeDirectory
        dir <- getDirectoryContents home
        return $ home </> head (map ("shake-demo" ++) ("":map show [2..]) \\ dir)
    putStrLn $ "% The Shake demo uses an empty directory, OK to use:"
    putStrLn $ "% " ++ dir
    b <- yesNo auto
    require b "% Please create an empty directory to run the demo from, then run 'shake --demo' again."

    putStr "% Copying files... "
    createDirectoryIfMissing True dir
    forM_ ["Build.hs","main.c","constants.c","constants.h","build" <.> if isWindows then "bat" else "sh"] $ \file ->
        copyFile (manual </> file) (dir </> file)
    when (not isWindows) $ do
        p <- getPermissions $ dir </> "build.sh"
        setPermissions (dir </> "build.sh") p{executable=True}
    putStrLn "done"

    let pause = do
            putStr "% Press ENTER to continue: "
            if auto then putLine "" else getLine
    let execute x = do
            putStrLn $ "% RUNNING: " ++ x
            cmd (Cwd dir) Shell x :: IO ()
    let build = if isWindows then "build" else "./build.sh"

    putStrLn "\n% [1/5] Building an example project with Shake."
    pause
    putStrLn $ "% RUNNING: cd " ++ dir
    execute build

    putStrLn "\n% [2/5] Running the produced example."
    pause
    execute $ "_build" </> "run" <.> exe

    putStrLn "\n% [3/5] Rebuilding an example project with Shake (nothing should change)."
    pause
    execute build

    putStrLn "\n% [4/5] Cleaning the build."
    pause
    execute $ build ++ " clean"

    putStrLn "\n% [5/5] Rebuilding with 2 threads and profiling."
    pause
    execute $ build ++ " -j2 --report --report=-"
    putStrLn "\n% See the profiling summary above, or look at the HTML profile report in"
    putStrLn $ "% " ++ dir </> "report.html"

    putStrLn "\n% Demo complete - all the examples can be run from:"
    putStrLn $ "% " ++ dir
    putStrLn "% For more info see http://shakebuild.com"
    when (isJust ninja) $ do
        putStrLn "\n% PS. Shake can also execute Ninja build files"
        putStrLn "% For more info see http://shakebuild.com/ninja"


-- | Require the user to press @y@ before continuing.
yesNo :: Bool -> IO Bool
yesNo auto = do
    putStr $ "% [Y/N] (then ENTER): "
    x <- if auto then putLine "y" else fmap (map toLower) getLine
    if "y" `isPrefixOf` x then return True
     else if "n" `isPrefixOf` x then return False
     else yesNo auto

putLine :: String -> IO String
putLine x = putStrLn x >> return x

-- | Replace exceptions with 'False'.
wrap :: IO Bool -> IO Bool
wrap act = act `catch_` const (return False)

-- | Require a condition to be true, or exit with a message.
require :: Bool -> String -> IO () require b msg = unless b $ putStrLn msg >> exitFailure shake-0.15.5/src/Development/Shake/Database.hs0000644000000000000000000005346512560222036017242 0ustar0000000000000000{-# LANGUAGE RecordWildCards, PatternGuards, ViewPatterns #-} {-# LANGUAGE MultiParamTypeClasses #-} {-# LANGUAGE DeriveDataTypeable, GeneralizedNewtypeDeriving #-} module Development.Shake.Database( Trace(..), Database, withDatabase, listDepends, lookupDependencies, Ops(..), build, Depends, progress, Stack, emptyStack, topStack, showStack, showTopStack, toReport, checkValid, listLive ) where import Development.Shake.Classes import General.Binary import Development.Shake.Pool import Development.Shake.Value import Development.Shake.Errors import Development.Shake.Storage import Development.Shake.Types import Development.Shake.Special import Development.Shake.Profile import Development.Shake.Monad import General.String import General.Intern as Intern import Numeric.Extra import Control.Applicative import Control.Exception import Control.Monad.Extra import Control.Concurrent.Extra import qualified Data.HashSet as Set import qualified Data.HashMap.Strict as Map import Data.IORef.Extra import Data.Maybe import Data.List import System.Time.Extra import Data.Monoid import Prelude type Map = Map.HashMap --------------------------------------------------------------------- -- UTILITY TYPES newtype Step = Step Word32 deriving (Eq,Ord,Show,Binary,NFData,Hashable,Typeable) incStep (Step i) = Step $ i + 1 --------------------------------------------------------------------- -- CALL STACK data Stack = Stack (Maybe Key) [Id] !(Set.HashSet Id) showStack :: Database -> Stack -> IO [String] showStack Database{..} (Stack _ xs _) = do status <- withLock lock $ readIORef status return $ reverse $ map (maybe "" (show . fst) . 
flip Map.lookup status) xs addStack :: Id -> Key -> Stack -> Stack addStack x key (Stack _ xs set) = Stack (Just key) (x:xs) (Set.insert x set) showTopStack :: Stack -> String showTopStack = maybe "" show . topStack topStack :: Stack -> Maybe Key topStack (Stack key _ _) = key checkStack :: [Id] -> Stack -> Maybe Id checkStack new (Stack _ old set) | bad:_ <- filter (`Set.member` set) new = Just bad | otherwise = Nothing emptyStack :: Stack emptyStack = Stack Nothing [] Set.empty --------------------------------------------------------------------- -- CENTRAL TYPES data Trace = Trace BS Float Float -- (message, start, end) deriving Show instance NFData Trace where rnf (Trace a b c) = rnf a `seq` rnf b `seq` rnf c -- | Invariant: The database does not have any cycles where a Key depends on itself data Database = Database {lock :: Lock ,intern :: IORef (Intern Key) ,status :: IORef (Map Id (Key, Status)) ,step :: Step ,journal :: Id -> (Key, Status {- Loaded or Missing -}) -> IO () ,diagnostic :: String -> IO () -- logging function ,assume :: Maybe Assume } data Status = Ready Result -- I have a value | Error SomeException -- I have been run and raised an error | Loaded Result -- Loaded from the database | Waiting Pending (Maybe Result) -- Currently checking if I am valid or building | Missing -- I am only here because I got into the Intern table deriving Show data Result = Result {result :: Value -- the result associated with the Key ,built :: {-# UNPACK #-} !Step -- when it was actually run ,changed :: {-# UNPACK #-} !Step -- the step for deciding if it's valid ,depends :: [[Id]] -- dependencies ,execution :: {-# UNPACK #-} !Float -- how long it took when it was last run (seconds) ,traces :: [Trace] -- a trace of the expensive operations (start/end in seconds since beginning of run) } deriving Show newtype Pending = Pending (IORef (IO ())) -- you must run this action when you finish, while holding DB lock -- after you have set the result to Error or Ready instance 
Show Pending where show _ = "Pending" statusType Ready{} = "Ready" statusType Error{} = "Error" statusType Loaded{} = "Loaded" statusType Waiting{} = "Waiting" statusType Missing{} = "Missing" isError Error{} = True; isError _ = False isWaiting Waiting{} = True; isWaiting _ = False isReady Ready{} = True; isReady _ = False -- All the waiting operations are only valid when isWaiting type Waiting = Status afterWaiting :: Waiting -> IO () -> IO () afterWaiting (Waiting (Pending p) _) act = modifyIORef' p (>> act) newWaiting :: Maybe Result -> IO Waiting newWaiting r = do ref <- newIORef $ return (); return $ Waiting (Pending ref) r runWaiting :: Waiting -> IO () runWaiting (Waiting (Pending p) _) = join $ readIORef p -- Wait for a set of actions to complete -- If the action returns True, the function will not be called again -- If the first argument is True, the thing is ended waitFor :: [(a, Waiting)] -> (Bool -> a -> IO Bool) -> IO () waitFor ws@(_:_) act = do todo <- newIORef $ length ws forM_ ws $ \(k,w) -> afterWaiting w $ do t <- readIORef todo when (t /= 0) $ do b <- act (t == 1) k writeIORef' todo $ if b then 0 else t - 1 getResult :: Status -> Maybe Result getResult (Ready r) = Just r getResult (Loaded r) = Just r getResult (Waiting _ r) = r getResult _ = Nothing --------------------------------------------------------------------- -- OPERATIONS newtype Depends = Depends {fromDepends :: [Id]} deriving (NFData) data Ops = Ops {stored :: Key -> IO (Maybe Value) -- ^ Given a Key, find the value stored on disk ,equal :: Key -> Value -> Value -> EqualCost -- ^ Given both Values, see if they are equal and how expensive that check was ,execute :: Stack -> Key -> Capture (Either SomeException (Value, [Depends], Seconds, [Trace])) -- ^ Given a stack and a key, either raise an exception or successfully build it } -- | Return either an exception (crash), or (how much time you spent waiting, the value) build :: Pool -> Database -> Ops -> Stack -> [Key] -> Capture (Either 
SomeException (Seconds,Depends,[Value])) build pool Database{..} Ops{..} stack ks continue = do join $ withLock lock $ do is <- forM ks $ \k -> do is <- readIORef intern case Intern.lookup k is of Just i -> return i Nothing -> do (is, i) <- return $ Intern.add k is writeIORef' intern is modifyIORef' status $ Map.insert i (k,Missing) return i whenJust (checkStack is stack) $ \bad -> do status <- readIORef status uncurry errorRuleRecursion $ case Map.lookup bad status of Nothing -> (Nothing, Nothing) Just (k,_) -> (Just $ typeKey k, Just $ show k) vs <- mapM (reduce stack) is let errs = [e | Error e <- vs] if all isReady vs then return $ continue $ Right (0, Depends is, [result r | Ready r <- vs]) else if not $ null errs then return $ continue $ Left $ head errs else do time <- offsetTime let done x = do case x of Left e -> addPoolPriority pool $ continue $ Left e Right v -> addPool pool $ do dur <- time; continue $ Right (dur, Depends is, v) return True waitFor (filter (isWaiting . snd) $ zip is vs) $ \finish i -> do s <- readIORef status case Map.lookup i s of Just (_, Error e) -> done $ Left e -- on error make sure we immediately kick off our parent Just (_, Ready{}) | finish -> done $ Right [result r | i <- is, let Ready r = snd $ fromJust $ Map.lookup i s] | otherwise -> return False return $ return () where (#=) :: Id -> (Key, Status) -> IO Status i #= (k,v) = do s <- readIORef status writeIORef' status $ Map.insert i (k,v) s diagnostic $ maybe "Missing" (statusType . snd) (Map.lookup i s) ++ " -> " ++ statusType v ++ ", " ++ maybe "" (show . 
fst) (Map.lookup i s) return v atom x = let s = show x in if ' ' `elem` s then "(" ++ s ++ ")" else s -- Rules for each eval* function -- * Must NOT lock -- * Must have an equal return to what is stored in the db at that point -- * Must not return Loaded reduce :: Stack -> Id -> IO Status reduce stack i = do s <- readIORef status case Map.lookup i s of Nothing -> err $ "interned value missing from database, " ++ show i Just (k, Missing) -> run stack i k Nothing Just (k, Loaded r) -> do let out b = diagnostic $ "valid " ++ show b ++ " for " ++ atom k ++ " " ++ atom (result r) let continue r = out True >> check stack i k r (depends r) let rebuild = out False >> run stack i k (Just r) case assume of Just AssumeDirty -> rebuild Just AssumeSkip -> continue r _ -> do s <- stored k case s of Just s -> case equal k (result r) s of NotEqual -> rebuild EqualCheap -> continue r EqualExpensive -> do -- warning, have the db lock while appending (may harm performance) r <- return r{result=s} journal i (k, Loaded r) i #= (k, Loaded r) continue r _ -> rebuild Just (k, res) -> return res run :: Stack -> Id -> Key -> Maybe Result -> IO Waiting run stack i k r = do w <- newWaiting r addPool pool $ do let reply res = do ans <- withLock lock $ do ans <- i #= (k, res) runWaiting w return ans case ans of Ready r -> do diagnostic $ "result " ++ atom k ++ " = "++ atom (result r) ++ " " ++ (if built r == changed r then "(changed)" else "(unchanged)") journal i (k, Loaded r) -- leave the DB lock before appending Error _ -> do diagnostic $ "result " ++ atom k ++ " = error" journal i (k, Missing) _ -> return () let norm = execute (addStack i k stack) k $ \res -> reply $ case res of Left err -> Error err Right (v,deps,(doubleToFloat -> execution),traces) -> let c | Just r <- r, equal k (result r) v /= NotEqual = changed r | otherwise = step in Ready Result{result=v,changed=c,built=step,depends=map fromDepends deps,..} case r of Just r | assume == Just AssumeClean -> do v <- stored k case v of 
Just v -> reply $ Ready r{result=v} Nothing -> norm _ -> norm i #= (k, w) check :: Stack -> Id -> Key -> Result -> [[Id]] -> IO Status check stack i k r [] = i #= (k, Ready r) check stack i k r (ds:rest) = do vs <- mapM (reduce (addStack i k stack)) ds let ws = filter (isWaiting . snd) $ zip ds vs if any isError vs || any (> built r) [changed | Ready Result{..} <- vs] then run stack i k $ Just r else if null ws then check stack i k r rest else do self <- newWaiting $ Just r waitFor ws $ \finish d -> do s <- readIORef status let buildIt = do b <- run stack i k $ Just r afterWaiting b $ runWaiting self return True case Map.lookup d s of Just (_, Error{}) -> buildIt Just (_, Ready r2) | changed r2 > built r -> buildIt | finish -> do res <- check stack i k r rest if not $ isWaiting res then runWaiting self else afterWaiting res $ runWaiting self return True | otherwise -> return False i #= (k, self) --------------------------------------------------------------------- -- PROGRESS progress :: Database -> IO Progress progress Database{..} = do s <- readIORef status return $ foldl' f mempty $ map snd $ Map.elems s where g = floatToDouble f s (Ready Result{..}) = if step == built then s{countBuilt = countBuilt s + 1, timeBuilt = timeBuilt s + g execution} else s{countSkipped = countSkipped s + 1, timeSkipped = timeSkipped s + g execution} f s (Loaded Result{..}) = s{countUnknown = countUnknown s + 1, timeUnknown = timeUnknown s + g execution} f s (Waiting _ r) = let (d,c) = timeTodo s t | Just Result{..} <- r = let d2 = d + g execution in d2 `seq` (d2,c) | otherwise = let c2 = c + 1 in c2 `seq` (d,c2) in s{countTodo = countTodo s + 1, timeTodo = t} f s _ = s --------------------------------------------------------------------- -- QUERY DATABASE -- | Given a map of representing a dependency order (with a show for error messages), find an ordering for the items such -- that no item points to an item before itself. -- Raise an error if you end up with a cycle. 
dependencyOrder :: (Eq a, Hashable a) => (a -> String) -> Map a [a] -> [a] -- Algorithm: -- Divide everyone up into those who have no dependencies [Id] -- And those who depend on a particular Id, Dep :-> Maybe [(Key,[Dep])] -- Where d :-> Just (k, ds), k depends on firstly d, then remaining on ds -- For each with no dependencies, add to list, then take its dep hole and -- promote them either to Nothing (if ds == []) or into a new slot. -- k :-> Nothing means the key has already been freed dependencyOrder shw status = f (map fst noDeps) $ Map.map Just $ Map.fromListWith (++) [(d, [(k,ds)]) | (k,d:ds) <- hasDeps] where (noDeps, hasDeps) = partition (null . snd) $ Map.toList status f [] mp | null bad = [] | otherwise = error $ unlines $ "Internal invariant broken, database seems to be cyclic" : map (" " ++) bad ++ ["... plus " ++ show (length badOverflow) ++ " more ..." | not $ null badOverflow] where (bad,badOverflow) = splitAt 10 $ [shw i | (i, Just _) <- Map.toList mp] f (x:xs) mp = x : f (now++xs) later where Just free = Map.lookupDefault (Just []) x mp (now,later) = foldl' g ([], Map.insert x Nothing mp) free g (free, mp) (k, []) = (k:free, mp) g (free, mp) (k, d:ds) = case Map.lookupDefault (Just []) d mp of Nothing -> g (free, mp) (k, ds) Just todo -> (free, Map.insert d (Just $ (k,ds) : todo) mp) -- | Eliminate all errors from the database, pretending they don't exist resultsOnly :: Map Id (Key, Status) -> Map Id (Key, Result) resultsOnly mp = Map.map (\(k, v) -> (k, let Just r = getResult v in r{depends = map (filter (isJust . flip Map.lookup keep)) $ depends r})) keep where keep = Map.filter (isJust . getResult . snd) mp removeStep :: Map Id (Key, Result) -> Map Id (Key, Result) removeStep = Map.filter (\(k,_) -> k /= stepKey) toReport :: Database -> IO [ProfileEntry] toReport Database{..} = do status <- fmap (removeStep . resultsOnly) $ readIORef status let order = let shw i = maybe "" (show . 
fst) $ Map.lookup i status in dependencyOrder shw $ Map.map (concat . depends . snd) status ids = Map.fromList $ zip order [0..] steps = let xs = Set.toList $ Set.fromList $ concat [[changed, built] | (_,Result{..}) <- Map.elems status] in Map.fromList $ zip (sortBy (flip compare) xs) [0..] f (k, Result{..}) = ProfileEntry {prfName = show k ,prfBuilt = fromStep built ,prfChanged = fromStep changed ,prfDepends = mapMaybe (`Map.lookup` ids) (concat depends) ,prfExecution = floatToDouble execution ,prfTraces = map fromTrace traces } where fromStep i = fromJust $ Map.lookup i steps fromTrace (Trace a b c) = ProfileTrace (unpack a) (floatToDouble b) (floatToDouble c) return [maybe (err "toReport") f $ Map.lookup i status | i <- order] checkValid :: Database -> (Key -> IO (Maybe Value)) -> (Key -> Value -> Value -> EqualCost) -> [(Key, Key)] -> IO () checkValid Database{..} stored equal missing = do status <- readIORef status intern <- readIORef intern diagnostic "Starting validity/lint checking" -- Do not use a forM here as you use too much stack space bad <- (\f -> foldM f [] (Map.toList status)) $ \seen (i,v) -> case v of (key, Ready Result{..}) -> do now <- stored key let good = maybe False ((==) EqualCheap . 
equal key result) now diagnostic $ "Checking if " ++ show key ++ " is " ++ show result ++ ", " ++ if good then "passed" else "FAILED" return $ [(key, result, now) | not good && not (specialAlwaysRebuilds result)] ++ seen _ -> return seen unless (null bad) $ do let n = length bad errorStructured ("Lint checking error - " ++ (if n == 1 then "value has" else show n ++ " values have") ++ " changed since being depended upon") (intercalate [("",Just "")] [ [("Key", Just $ show key),("Old", Just $ show result),("New", Just $ maybe "" show now)] | (key, result, now) <- bad]) "" bad <- return [(parent,key) | (parent, key) <- missing, isJust $ Intern.lookup key intern] unless (null bad) $ do let n = length bad errorStructured ("Lint checking error - " ++ (if n == 1 then "value" else show n ++ " values") ++ " did not have " ++ (if n == 1 then "its" else "their") ++ " creation tracked") (intercalate [("",Just "")] [ [("Rule", Just $ show parent), ("Created", Just $ show key)] | (parent,key) <- bad]) "" diagnostic "Validity/lint check passed" listLive :: Database -> IO [Key] listLive Database{..} = do diagnostic "Listing live keys" status <- readIORef status return [k | (k, Ready{}) <- Map.elems status] listDepends :: Database -> Depends -> IO [Key] listDepends Database{..} (Depends xs) = withLock lock $ do status <- readIORef status return $ map (fst . fromJust . flip Map.lookup status) xs lookupDependencies :: Database -> Key -> IO [Key] lookupDependencies Database{..} k = do withLock lock $ do intern <- readIORef intern status <- readIORef status let Just i = Intern.lookup k intern let Just (_, Ready r) = Map.lookup i status return $ map (fst . fromJust . 
flip Map.lookup status) $ concat $ depends r --------------------------------------------------------------------- -- STORAGE -- To simplify journaling etc we smuggle the Step in the database, with a special StepKey newtype StepKey = StepKey () deriving (Show,Eq,Typeable,Hashable,Binary,NFData) stepKey :: Key stepKey = newKey $ StepKey () toStepResult :: Step -> Result toStepResult i = Result (newValue i) i i [] 0 [] fromStepResult :: Result -> Step fromStepResult = fromValue . result withDatabase :: ShakeOptions -> (String -> IO ()) -> (Database -> IO a) -> IO a withDatabase opts diagnostic act = do registerWitness $ StepKey () registerWitness $ Step 0 witness <- currentWitness withStorage opts diagnostic witness $ \mp2 journal -> do let mp1 = Intern.fromList [(k, i) | (i, (k,_)) <- Map.toList mp2] (mp1, stepId) <- case Intern.lookup stepKey mp1 of Just stepId -> return (mp1, stepId) Nothing -> do (mp1, stepId) <- return $ Intern.add stepKey mp1 return (mp1, stepId) intern <- newIORef mp1 status <- newIORef mp2 let step = case Map.lookup stepId mp2 of Just (_, Loaded r) -> incStep $ fromStepResult r _ -> Step 1 journal stepId (stepKey, Loaded $ toStepResult step) lock <- newLock act Database{assume=shakeAssume opts,..} instance BinaryWith Witness Result where putWith ws (Result x1 x2 x3 x4 x5 x6) = putWith ws x1 >> put x2 >> put x3 >> put (BinList $ map BinList x4) >> put (BinFloat x5) >> put (BinList x6) getWith ws = (\x1 x2 x3 (BinList x4) (BinFloat x5) (BinList x6) -> Result x1 x2 x3 (map fromBinList x4) x5 x6) <$> getWith ws <*> get <*> get <*> get <*> get <*> get instance Binary Trace where put (Trace a b c) = put a >> put (BinFloat b) >> put (BinFloat c) get = (\a (BinFloat b) (BinFloat c) -> Trace a b c) <$> get <*> get <*> get instance BinaryWith Witness Status where putWith ctx Missing = putWord8 0 putWith ctx (Loaded x) = putWord8 1 >> putWith ctx x putWith ctx x = err $ "putWith, Cannot write Status with constructor " ++ statusType x getWith ctx = do i 
<- getWord8; if i == 0 then return Missing else fmap Loaded $ getWith ctx shake-0.15.5/src/Development/Shake/Core.hs0000644000000000000000000011645012560222036016420 0ustar0000000000000000{-# LANGUAGE RecordWildCards, GeneralizedNewtypeDeriving, ScopedTypeVariables, PatternGuards #-} {-# LANGUAGE ExistentialQuantification, MultiParamTypeClasses #-} {-# LANGUAGE CPP #-} #if __GLASGOW_HASKELL__ >= 704 {-# LANGUAGE ConstraintKinds #-} #endif module Development.Shake.Core( run, #if __GLASGOW_HASKELL__ >= 704 ShakeValue, #endif Rule(..), Rules, rule, action, withoutActions, alternatives, priority, Action, actionOnException, actionFinally, apply, apply1, traced, getShakeOptions, trackUse, trackChange, trackAllow, getVerbosity, putLoud, putNormal, putQuiet, withVerbosity, quietly, Resource, newResource, newResourceIO, withResource, withResources, newThrottle, newThrottleIO, newCache, newCacheIO, unsafeExtraThread, -- Internal stuff rulesIO, runAfter, unsafeIgnoreDependencies, ) where import Control.Exception.Extra import Control.Applicative import Data.Tuple.Extra import Control.Concurrent.Extra import Control.Monad.Extra import Control.Monad.IO.Class import Control.Monad.Trans.Writer.Strict import Data.Typeable import Data.Function import Data.Either.Extra import Numeric.Extra import Data.List import qualified Data.HashMap.Strict as Map import Data.Maybe import Data.IORef import System.Directory import System.IO.Extra import System.Time.Extra import Data.Monoid import System.IO.Unsafe import Development.Shake.Classes import Development.Shake.Pool import Development.Shake.Database import Development.Shake.Monad import Development.Shake.Resource import Development.Shake.Value import Development.Shake.Profile import Development.Shake.Types import Development.Shake.Errors import Development.Shake.Special import General.Timing import General.Extra import General.Concurrent import General.Cleanup import General.String import Prelude 
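The `dependencyOrder` function in Database.hs above promises an ordering in which no item appears before its dependencies, raising an error when a cycle makes that impossible. Shake's implementation uses an inverted index of "who is waiting on whom" for efficiency; the sketch below trades that for simplicity, using only `containers`, and is our own illustrative code, not the library's:

```haskell
import qualified Data.Map as Map
import Data.List (partition)

-- Order items so every item appears after all of its dependencies;
-- raise an error if a dependency cycle makes that impossible.
dependencyOrder :: (Ord a, Show a) => Map.Map a [a] -> [a]
dependencyOrder deps = go (map fst ready) (Map.fromList rest)
    where
        -- items with no dependencies can be emitted immediately
        (ready, rest) = partition (null . snd) (Map.toList deps)
        go [] mp
            | Map.null mp = []
            | otherwise = error $ "dependency cycle involving " ++ show (Map.keys mp)
        go (x:xs) mp = x : go (xs ++ Map.keys done) todo
            where
                -- x is now placed, so remove it from everyone's pending dependencies
                mp' = Map.map (filter (/= x)) mp
                (done, todo) = Map.partition null mp'
```

This is quadratic in the worst case, whereas the real version keeps each blocked item filed under the single dependency it is currently waiting on, visiting each edge once.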
--------------------------------------------------------------------- -- RULES #if __GLASGOW_HASKELL__ >= 704 -- | Define an alias for the six type classes required for things involved in Shake 'Development.Shake.Rule's. -- This alias is only available in GHC 7.4 and above, and requires the @ConstraintKinds@ extension. -- -- To define your own values meeting the necessary constraints it is convenient to use the extensions -- @GeneralizedNewtypeDeriving@ and @DeriveDataTypeable@ to write: -- -- > newtype MyType = MyType (String, Bool) deriving (Show,Typeable,Eq,Hashable,Binary,NFData) type ShakeValue a = (Show a, Typeable a, Eq a, Hashable a, Binary a, NFData a) #endif -- | Define a pair of types that can be used by Shake rules. -- To import all the type classes required see "Development.Shake.Classes". -- -- A 'Rule' instance for a class of artifacts (e.g. /files/) provides: -- -- * How to identify individual artifacts, given by the @key@ type, e.g. with file names. -- -- * How to describe the state of an artifact, given by the @value@ type, e.g. the file modification time. -- -- * A way to compare two states of the same individual artifact, with 'equalValue' returning either -- 'EqualCheap' or 'NotEqual'. -- -- * A way to query the current state of an artifact, with 'storedValue' returning the current state, -- or 'Nothing' if there is no current state (e.g. the file does not exist). -- -- Checking if an artifact needs to be built consists of comparing two @value@s -- of the same @key@ with 'equalValue'. The first value is obtained by applying -- 'storedValue' to the @key@ and the second is the value stored in the build -- database after the last successful build. -- -- As an example, below is a simplified rule for building files, where files are identified -- by a 'FilePath' and their state is identified by a hash of their contents -- (the builtin functions 'Development.Shake.need' and 'Development.Shake.%>' -- provide a similar rule). 
--
-- @
--newtype File = File FilePath deriving (Show, Typeable, Eq, Hashable, Binary, NFData)
--newtype Modtime = Modtime Double deriving (Show, Typeable, Eq, Hashable, Binary, NFData)
--getFileModtime file = ...
--
--instance Rule File Modtime where
--    storedValue _ (File x) = do
--        exists <- System.Directory.doesFileExist x
--        if exists then Just \<$\> getFileModtime x else return Nothing
--    equalValue _ _ t1 t2 =
--        if t1 == t2 then EqualCheap else NotEqual
-- @
--
-- This example instance means:
--
-- * A value of type @File@ uniquely identifies a generated file.
--
-- * A value of type @Modtime@ will be used to check if a file is up-to-date.
--
-- It is important to distinguish 'Rule' instances from actual /rules/. 'Rule'
-- instances are one component required for the creation of rules.
-- Actual /rules/ are functions from a @key@ to an 'Action'; they are
-- added to 'Rules' using the 'rule' function.
--
-- A rule can be created for the instance above with:
--
-- @
-- -- Compile foo files; for every foo output file there must be a
-- -- single input file named \"filename.foo\".
-- compileFoo :: 'Rules' ()
-- compileFoo = 'rule' (Just . compile)
--     where
--         compile :: File -> 'Action' Modtime
--         compile (File outputFile) = do
--             -- figure out the name of the input file
--             let inputFile = outputFile '<.>' \"foo\"
--             'unit' $ 'Development.Shake.cmd' \"fooCC\" inputFile outputFile
--             -- return the (new) file modtime of the output file:
--             getFileModtime outputFile
-- @
--
-- /Note:/ In this example, the timestamps of the input files are never
-- used, let alone compared to the timestamps of the output files.
-- Dependencies between output and input files are /not/ expressed by
-- 'Rule' instances. Dependencies are created automatically by 'apply'.
--
-- For rules whose values are not stored externally,
-- 'storedValue' should return 'Just' with a sentinel value
-- and 'equalValue' should always return 'EqualCheap' for that sentinel.
class ( #if __GLASGOW_HASKELL__ >= 704 ShakeValue key, ShakeValue value #else Show key, Typeable key, Eq key, Hashable key, Binary key, NFData key, Show value, Typeable value, Eq value, Hashable value, Binary value, NFData value #endif ) => Rule key value where -- | /[Required]/ Retrieve the @value@ associated with a @key@, if available. -- -- As an example for filenames/timestamps, if the file exists you should return 'Just' -- the timestamp, but otherwise return 'Nothing'. storedValue :: ShakeOptions -> key -> IO (Maybe value) -- | /[Optional]/ Equality check, with a notion of how expensive the check was. equalValue :: ShakeOptions -> key -> value -> value -> EqualCost equalValue _ _ v1 v2 = if v1 == v2 then EqualCheap else NotEqual data ARule m = forall key value . Rule key value => ARule (key -> Maybe (m value)) ruleKey :: (key -> Maybe (m value)) -> key ruleKey = err "ruleKey" ruleValue :: (key -> Maybe (m value)) -> value ruleValue = err "ruleValue" -- | Define a set of rules. Rules can be created with calls to functions such as 'Development.Shake.%>' or 'action'. Rules are combined -- with either the 'Monoid' instance, or (more commonly) the 'Monad' instance and @do@ notation. To define your own -- custom types of rule, see "Development.Shake.Rule". newtype Rules a = Rules (WriterT (SRules Action) IO a) -- All IO must be associative/commutative (e.g. creating IORef/MVars) deriving (Monad, Functor, Applicative) rulesIO :: IO a -> Rules a rulesIO = Rules . liftIO newRules :: SRules Action -> Rules () newRules = Rules . 
tell modifyRules :: (SRules Action -> SRules Action) -> Rules () -> Rules () modifyRules f (Rules r) = Rules $ censor f r getRules :: Rules () -> IO (SRules Action) getRules (Rules r) = execWriterT r data SRules m = SRules {actions :: [m ()] ,rules :: Map.HashMap TypeRep{-k-} (TypeRep{-k-},TypeRep{-v-},[(Double,ARule m)]) -- higher fst is higher priority } instance Monoid (SRules m) where mempty = SRules [] (Map.fromList []) mappend (SRules x1 x2) (SRules y1 y2) = SRules (x1++y1) (Map.unionWith f x2 y2) where f (k, v1, xs) (_, v2, ys) | v1 == v2 = (k, v1, xs ++ ys) | otherwise = unsafePerformIO $ errorIncompatibleRules k v1 v2 instance Monoid a => Monoid (Rules a) where mempty = return mempty mappend = liftA2 mappend -- | Add a rule to build a key, returning an appropriate 'Action'. All rules at a given priority -- must be disjoint. Rules have priority 1 by default, but can be modified with 'priority'. rule :: Rule key value => (key -> Maybe (Action value)) -> Rules () rule r = newRules mempty{rules = Map.singleton k (k, v, [(1,ARule r)])} where k = typeOf $ ruleKey r; v = typeOf $ ruleValue r -- | Change the priority of a given set of rules, where higher priorities take precedence. -- All matching rules at a given priority must be disjoint, or an error is raised. -- All builtin Shake rules have priority between 0 and 1. -- Excessive use of 'priority' is discouraged. As an example: -- -- @ -- 'priority' 4 $ \"hello.*\" %> \\out -> 'writeFile'' out \"hello.*\" -- 'priority' 8 $ \"*.txt\" %> \\out -> 'writeFile'' out \"*.txt\" -- @ -- -- In this example @hello.txt@ will match the second rule, instead of raising an error about ambiguity. priority :: Double -> Rules () -> Rules () priority i = modifyRules $ \s -> s{rules = Map.map (\(a,b,cs) -> (a,b,map (first $ const i) cs)) $ rules s} -- | Change the matching behaviour of rules so rules do not have to be disjoint, but are instead matched -- in order. Only recommended for small blocks containing a handful of rules. 
-- -- @ -- 'alternatives' $ do -- \"hello.*\" %> \\out -> 'writeFile'' out \"hello.*\" -- \"*.txt\" %> \\out -> 'writeFile'' out \"*.txt\" -- @ -- -- In this example @hello.txt@ will match the first rule, instead of raising an error about ambiguity. alternatives :: Rules () -> Rules () alternatives = modifyRules $ \r -> r{rules = Map.map f $ rules r} where f (k, v, []) = (k, v, []) f (k, v, xs) = let (is,rs) = unzip xs in (k, v, [(maximum is, foldl1' g rs)]) g (ARule a) (ARule b) = ARule $ \x -> a x `mplus` b2 x where b2 = fmap (fmap (fromJust . cast)) . b . fromJust . cast -- | Run an action, usually used for specifying top-level requirements. -- -- @ -- main = 'Development.Shake.shake' 'shakeOptions' $ do -- 'action' $ do -- b <- 'Development.Shake.doesFileExist' \"file.src\" -- when b $ 'Development.Shake.need' [\"file.out\"] -- @ -- -- This 'action' builds @file.out@, but only if @file.src@ exists. The 'action' -- will be run in every build execution (unless 'withoutActions' is used), so only cheap -- operations should be performed. All arguments to 'action' may be run in parallel, in any order. -- -- For the standard requirement of only 'Development.Shake.need'ing a fixed list of files in the 'action', -- see 'Development.Shake.want'. action :: Action a -> Rules () action a = newRules mempty{actions=[void a]} -- | Remove all actions specified in a set of rules, usually used for implementing -- command line specification of what to build. 
withoutActions :: Rules () -> Rules () withoutActions = modifyRules $ \x -> x{actions=[]} registerWitnesses :: SRules m -> IO () registerWitnesses SRules{..} = forM_ (Map.elems rules) $ \(_, _, (_,ARule r):_) -> do registerWitness $ ruleKey r registerWitness $ ruleValue r data RuleInfo m = RuleInfo {stored :: Key -> IO (Maybe Value) ,equal :: Key -> Value -> Value -> EqualCost ,execute :: Key -> m Value ,resultType :: TypeRep } createRuleinfo :: ShakeOptions -> SRules Action -> Map.HashMap TypeRep (RuleInfo Action) createRuleinfo opt SRules{..} = flip Map.map rules $ \(_,tv,rs) -> RuleInfo (stored rs) (equal rs) (execute rs) tv where stored ((_,ARule r):_) = fmap (fmap newValue) . f r . fromKey where f :: Rule key value => (key -> Maybe (m value)) -> (key -> IO (Maybe value)) f _ = storedValue opt equal ((_,ARule r):_) = \k v1 v2 -> f r (fromKey k) (fromValue v1) (fromValue v2) where f :: Rule key value => (key -> Maybe (m value)) -> key -> value -> value -> EqualCost f _ = equalValue opt execute rs = \k -> case filter (not . null) $ map (mapMaybe ($ k)) rs2 of [r]:_ -> r rs -> liftIO $ errorMultipleRulesMatch (typeKey k) (show k) (length rs) where rs2 = sets [(i, \k -> fmap (fmap newValue) $ r (fromKey k)) | (i,ARule r) <- rs] sets :: Ord a => [(a, b)] -> [[b]] -- highest to lowest sets = map (map snd) . reverse . groupBy ((==) `on` fst) . 
sortBy (compare `on` fst) runStored :: Map.HashMap TypeRep (RuleInfo m) -> Key -> IO (Maybe Value) runStored mp k = case Map.lookup (typeKey k) mp of Nothing -> return Nothing Just RuleInfo{..} -> stored k runEqual :: Map.HashMap TypeRep (RuleInfo m) -> Key -> Value -> Value -> EqualCost runEqual mp k v1 v2 = case Map.lookup (typeKey k) mp of Nothing -> NotEqual Just RuleInfo{..} -> equal k v1 v2 runExecute :: MonadIO m => Map.HashMap TypeRep (RuleInfo m) -> Key -> m Value runExecute mp k = let tk = typeKey k in case Map.lookup tk mp of Nothing -> liftIO $ errorNoRuleToBuildType tk (Just $ show k) Nothing Just RuleInfo{..} -> execute k --------------------------------------------------------------------- -- MAKE -- global constants of Action data Global = Global {globalDatabase :: Database ,globalPool :: Pool ,globalCleanup :: Cleanup ,globalTimestamp :: IO Seconds ,globalRules :: Map.HashMap TypeRep (RuleInfo Action) ,globalOutput :: Verbosity -> String -> IO () ,globalOptions :: ShakeOptions ,globalDiagnostic :: String -> IO () ,globalLint :: String -> IO () ,globalAfter :: IORef [IO ()] ,globalTrackAbsent :: IORef [(Key, Key)] -- in rule fst, snd must be absent } -- local variables of Action data Local = Local -- constants {localStack :: Stack -- stack scoped local variables ,localVerbosity :: Verbosity ,localBlockApply :: Maybe String -- reason to block apply, or Nothing to allow -- mutable local variables ,localDepends :: [Depends] -- built up in reverse ,localDiscount :: !Seconds ,localTraces :: [Trace] -- in reverse ,localTrackAllows :: [Key -> Bool] ,localTrackUsed :: [Key] } -- | The 'Action' monad, use 'liftIO' to raise 'IO' actions into it, and 'Development.Shake.need' to execute files. -- Action values are used by 'rule' and 'action'. The 'Action' monad tracks the dependencies of a 'Rule'. 
newtype Action a = Action {fromAction :: RAW Global Local a} deriving (Functor, Applicative, Monad, MonadIO) actionBoom :: Bool -> Action a -> IO b -> Action a actionBoom runOnSuccess act clean = do cleanup <- Action $ getsRO globalCleanup clean <- liftIO $ addCleanup cleanup $ void clean res <- Action $ catchRAW (fromAction act) $ \e -> liftIO (clean True) >> throwRAW e liftIO $ clean runOnSuccess return res -- | If an exception is raised by the 'Action', perform some 'IO'. actionOnException :: Action a -> IO b -> Action a actionOnException = actionBoom False -- | After an 'Action', perform some 'IO', even if there is an exception. actionFinally :: Action a -> IO b -> Action a actionFinally = actionBoom True -- | Internal main function (not exported publicly) run :: ShakeOptions -> Rules () -> IO () run opts@ShakeOptions{..} rs = (if shakeLineBuffering then lineBuffering else id) $ do opts@ShakeOptions{..} <- if shakeThreads /= 0 then return opts else do p <- getProcessorCount; return opts{shakeThreads=p} start <- offsetTime rs <- getRules rs registerWitnesses rs outputLocked <- do lock <- newLock return $ \v msg -> withLock lock $ shakeOutput v msg let diagnostic = if shakeVerbosity >= Diagnostic then outputLocked Diagnostic . ("% "++) else const $ return () let output v = outputLocked v . abbreviate shakeAbbreviations except <- newIORef (Nothing :: Maybe (String, ShakeException)) let raiseError err | not shakeStaunch = throwIO err | otherwise = do let named = abbreviate shakeAbbreviations . 
shakeExceptionTarget atomicModifyIORef except $ \v -> (Just $ fromMaybe (named err, err) v, ()) -- no need to print exceptions here, they get printed when they are wrapped lint <- if isNothing shakeLint then return $ const $ return () else do dir <- getCurrentDirectory return $ \msg -> do now <- getCurrentDirectory when (dir /= now) $ errorStructured "Lint checking error - current directory has changed" [("When", Just msg) ,("Wanted",Just dir) ,("Got",Just now)] "" after <- newIORef [] absent <- newIORef [] withCleanup $ \cleanup -> do _ <- addCleanup cleanup $ do when shakeTimings printTimings resetTimings -- so we don't leak memory withNumCapabilities shakeThreads $ do withDatabase opts diagnostic $ \database -> do wait <- newBarrier tid <- forkIO $ flip finally (signalBarrier wait ()) $ shakeProgress $ do failure <- fmap (fmap fst) $ readIORef except stats <- progress database return stats{isFailure=failure} addCleanup cleanup $ do killThread tid void $ timeout 1000000 $ waitBarrier wait let ruleinfo = createRuleinfo opts rs addTiming "Running rules" runPool (shakeThreads == 1) shakeThreads $ \pool -> do let s0 = Global database pool cleanup start ruleinfo output opts diagnostic lint after absent let s1 = Local emptyStack shakeVerbosity Nothing [] 0 [] [] [] forM_ (actions rs) $ \act -> do addPool pool $ runAction s0 s1 act $ \x -> case x of Left e -> raiseError =<< shakeException s0 (return ["Top-level action/want"]) e Right x -> return x maybe (return ()) (throwIO . 
snd) =<< readIORef except when (null $ actions rs) $ do when (shakeVerbosity >= Normal) $ output Normal "Warning: No want/action statements, nothing to do" when (isJust shakeLint) $ do addTiming "Lint checking" absent <- readIORef absent checkValid database (runStored ruleinfo) (runEqual ruleinfo) absent when (shakeVerbosity >= Loud) $ output Loud "Lint checking succeeded" when (shakeReport /= []) $ do addTiming "Profile report" report <- toReport database forM_ shakeReport $ \file -> do when (shakeVerbosity >= Normal) $ output Normal $ "Writing report to " ++ file writeProfile file report when (shakeLiveFiles /= []) $ do addTiming "Listing live" live <- listLive database let liveFiles = [show k | k <- live, specialIsFileKey $ typeKey k] forM_ shakeLiveFiles $ \file -> do when (shakeVerbosity >= Normal) $ output Normal $ "Writing live list to " ++ file (if file == "-" then putStr else writeFile file) $ unlines liveFiles sequence_ . reverse =<< readIORef after lineBuffering :: IO a -> IO a lineBuffering = withBuffering stdout LineBuffering . withBuffering stderr LineBuffering abbreviate :: [(String,String)] -> String -> String abbreviate [] = id abbreviate abbrev = f where -- order so longer abbreviations are preferred ordAbbrev = sortBy (flip (compare `on` length . fst)) abbrev f [] = [] f x | (to,rest):_ <- [(to,rest) | (from,to) <- ordAbbrev, Just rest <- [stripPrefix from x]] = to ++ f rest f (x:xs) = x : f xs runAction :: Global -> Local -> Action a -> Capture (Either SomeException a) runAction g l (Action x) k = runRAW g l x k runAfter :: IO () -> Action () runAfter op = do Global{..} <- Action getRO liftIO $ atomicModifyIORef globalAfter $ \ops -> (op:ops, ()) -- | Execute a rule, returning the associated values. If possible, the rules will be run in parallel. -- This function requires that appropriate rules have been added with 'rule'. -- All @key@ values passed to 'apply' become dependencies of the 'Action'. 
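The `abbreviate` helper above rewrites every occurrence of an abbreviation in a string, always preferring the longest matching source string. A standalone sketch of the same logic (the name `abbreviateLike` is invented for illustration and is not part of the Shake API):

```haskell
import Data.List (sortBy, stripPrefix)
import Data.Function (on)

-- Rewrite occurrences using the abbreviation table. Sorting the table by
-- descending length of the source string means overlapping abbreviations
-- resolve deterministically: the longest match always wins.
abbreviateLike :: [(String, String)] -> String -> String
abbreviateLike [] = id
abbreviateLike abbrev = f
    where
        ordAbbrev = sortBy (flip (compare `on` (length . fst))) abbrev
        f [] = []
        f x | (to,rest):_ <- [(to,rest) | (from,to) <- ordAbbrev, Just rest <- [stripPrefix from x]] = to ++ f rest
        f (x:xs) = x : f xs
```

For example, with the table `[("ab","X"),("abc","Y")]` the input `"abcd"` becomes `"Yd"`, because the longer source `"abc"` is tried first.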
apply :: Rule key value => [key] -> Action [value] apply = f -- Don't short-circuit [] as we still want error messages where -- We don't want the forall in the Haddock docs f :: forall key value . Rule key value => [key] -> Action [value] f ks = do let tk = typeOf (err "apply key" :: key) tv = typeOf (err "apply type" :: value) Global{..} <- Action getRO block <- Action $ getsRW localBlockApply whenJust block $ liftIO . errorNoApply tk (fmap show $ listToMaybe ks) case Map.lookup tk globalRules of Nothing -> liftIO $ errorNoRuleToBuildType tk (fmap show $ listToMaybe ks) (Just tv) Just RuleInfo{resultType=tv2} | tv /= tv2 -> liftIO $ errorRuleTypeMismatch tk (fmap show $ listToMaybe ks) tv2 tv _ -> fmap (map fromValue) $ applyKeyValue $ map newKey ks applyKeyValue :: [Key] -> Action [Value] applyKeyValue [] = return [] applyKeyValue ks = do global@Global{..} <- Action getRO let exec stack k continue = do let s = Local {localVerbosity=shakeVerbosity globalOptions, localDepends=[], localStack=stack, localBlockApply=Nothing ,localDiscount=0, localTraces=[], localTrackAllows=[], localTrackUsed=[]} let top = showTopStack stack time <- offsetTime runAction global s (do liftIO $ evaluate $ rnf k liftIO $ globalLint $ "before building " ++ top putWhen Chatty $ "# " ++ show k res <- runExecute globalRules k when (shakeLint globalOptions == Just LintTracker) trackCheckUsed Action $ fmap ((,) res) getRW) $ \x -> case x of Left e -> continue . Left . 
toException =<< shakeException global (showStack globalDatabase stack) e Right (res, Local{..}) -> do dur <- time globalLint $ "after building " ++ top let ans = (res, reverse localDepends, dur - localDiscount, reverse localTraces) evaluate $ rnf ans continue $ Right ans stack <- Action $ getsRW localStack (dur, dep, vs) <- Action $ captureRAW $ build globalPool globalDatabase (Ops (runStored globalRules) (runEqual globalRules) exec) stack ks Action $ modifyRW $ \s -> s{localDiscount=localDiscount s + dur, localDepends=dep : localDepends s} return vs -- | Turn a normal exception into a ShakeException, giving it a stack and printing it out if in staunch mode. -- If the exception is already a ShakeException (e.g. it's a child of ours who failed and we are rethrowing) -- then do nothing with it. shakeException :: Global -> IO [String] -> SomeException -> IO ShakeException shakeException Global{globalOptions=ShakeOptions{..},..} stk e@(SomeException inner) = case cast inner of Just e@ShakeException{} -> return e Nothing -> do stk <- stk e <- return $ ShakeException (last $ "Unknown call stack" : stk) stk e when (shakeStaunch && shakeVerbosity >= Quiet) $ globalOutput Quiet $ show e ++ "Continuing due to staunch mode" return e -- | Apply a single rule, equivalent to calling 'apply' with a singleton list. Where possible, -- use 'apply' to allow parallelism. apply1 :: Rule key value => key -> Action value apply1 = fmap head . apply . return -- | Get the initial 'ShakeOptions', these will not change during the build process. getShakeOptions :: Action ShakeOptions getShakeOptions = Action $ getsRO globalOptions -- | Write an action to the trace list, along with the start/end time of running the IO action. -- The 'Development.Shake.cmd' and 'Development.Shake.command' functions automatically call 'traced'. -- The trace list is used for profile reports (see 'shakeReport'). 
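`traced` (defined next) brackets the IO action between two timestamp reads and prepends a trace record to the rule's local state. Reduced to its bookkeeping over `IORef`s, with an explicit clock reference so the numbers are deterministic (all names here are hypothetical sketches, not Shake's internals):

```haskell
import Data.IORef

-- Record a (label, start, stop) triple around an action, reading the
-- timestamps from a supplied clock -- the same shape as traced, which
-- reads globalTimestamp before and after and conses a Trace entry.
tracedLike :: IORef Double                      -- ^ clock source
           -> IORef [(String, Double, Double)]  -- ^ accumulated traces
           -> String -> IO a -> IO a
tracedLike clock traces msg act = do
    start <- readIORef clock
    res <- act
    stop <- readIORef clock
    modifyIORef traces ((msg, start, stop) :)
    return res
```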
traced :: String -> IO a -> Action a traced msg act = do Global{..} <- Action getRO stack <- Action $ getsRW localStack start <- liftIO globalTimestamp putNormal $ "# " ++ msg ++ " (for " ++ showTopStack stack ++ ")" res <- liftIO act stop <- liftIO globalTimestamp Action $ modifyRW $ \s -> s{localTraces = Trace (pack msg) (doubleToFloat start) (doubleToFloat stop) : localTraces s} return res putWhen :: Verbosity -> String -> Action () putWhen v msg = do Global{..} <- Action getRO verb <- getVerbosity when (verb >= v) $ liftIO $ globalOutput v msg -- | Write a message to the output when the verbosity ('shakeVerbosity') is appropriate. -- The output will not be interleaved with any other Shake messages -- (other than those generated by system commands). putLoud, putNormal, putQuiet :: String -> Action () putLoud = putWhen Loud putNormal = putWhen Normal putQuiet = putWhen Quiet -- | Get the current verbosity level, originally set by 'shakeVerbosity'. If you -- want to output information to the console, you are recommended to use -- 'putLoud' \/ 'putNormal' \/ 'putQuiet', which ensures multiple messages are -- not interleaved. The verbosity can be modified locally by 'withVerbosity'. getVerbosity :: Action Verbosity getVerbosity = Action $ getsRW localVerbosity -- | Run an action with a particular verbosity level. -- Will not update the 'shakeVerbosity' returned by 'getShakeOptions' and will -- not have any impact on 'Diagnostic' tracing. withVerbosity :: Verbosity -> Action a -> Action a withVerbosity new = Action . unmodifyRW f . fromAction where f s0 = (s0{localVerbosity=new}, \s -> s{localVerbosity=localVerbosity s0}) -- | Run an action with 'Quiet' verbosity, in particular messages produced by 'traced' -- (including from 'Development.Shake.cmd' or 'Development.Shake.command') will not be printed to the screen. -- Will not update the 'shakeVerbosity' returned by 'getShakeOptions' and will -- not turn off any 'Diagnostic' tracing. 
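`withVerbosity` above, and `quietly`/`blockApply` below, all use `unmodifyRW`, which pairs a state modification with an undo function that reinstates the old field afterwards. A minimal sketch of the same save/modify/restore shape over an `IORef` (names invented for illustration; the real continuation-based version also restores on exceptions, which this sketch does not):

```haskell
import Data.IORef

-- Run an action with the state temporarily modified. The function f
-- returns both the modified state and an undo that, given the state at
-- the end, puts the saved field back -- mirroring unmodifyRW's contract.
withModified :: IORef s -> (s -> (s, s -> s)) -> IO a -> IO a
withModified ref f act = do
    s0 <- readIORef ref
    let (s1, undo) = f s0
    writeIORef ref s1
    res <- act
    modifyIORef ref undo
    return res
```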
quietly :: Action a -> Action a quietly = withVerbosity Quiet --------------------------------------------------------------------- -- TRACKING -- | Track that a key has been used by the action preceding it. trackUse :: #if __GLASGOW_HASKELL__ >= 704 ShakeValue key #else (Show key, Typeable key, Eq key, Hashable key, Binary key, NFData key) #endif => key -> Action () -- One of the following must be true: -- 1) you are the one building this key (e.g. key == topStack) -- 2) you have already been used by apply, and are on the dependency list -- 3) someone explicitly gave you permission with trackAllow -- 4) at the end of the rule, a) you are now on the dependency list, and b) this key itself has no dependencies (is a source file) trackUse key = do let k = newKey key Global{..} <- Action getRO l@Local{..} <- Action getRW deps <- liftIO $ concatMapM (listDepends globalDatabase) localDepends let top = topStack localStack if top == Just k then return () -- condition 1 else if k `elem` deps then return () -- condition 2 else if any ($ k) localTrackAllows then return () -- condition 3 else Action $ putRW l{localTrackUsed = k : localTrackUsed} -- condition 4 trackCheckUsed :: Action () trackCheckUsed = do Global{..} <- Action getRO Local{..} <- Action getRW liftIO $ do deps <- concatMapM (listDepends globalDatabase) localDepends -- check 3a bad <- return $ localTrackUsed \\ deps unless (null bad) $ do let n = length bad errorStructured ("Lint checking error - " ++ (if n == 1 then "value was" else show n ++ " values were") ++ " used but not depended upon") [("Used", Just $ show x) | x <- bad] "" -- check 3b bad <- flip filterM localTrackUsed $ \k -> fmap (not . 
null) $ lookupDependencies globalDatabase k unless (null bad) $ do let n = length bad errorStructured ("Lint checking error - " ++ (if n == 1 then "value was" else show n ++ " values were") ++ " depended upon after being used") [("Used", Just $ show x) | x <- bad] "" -- | Track that a key has been changed by the action preceding it. trackChange :: #if __GLASGOW_HASKELL__ >= 704 ShakeValue key #else (Show key, Typeable key, Eq key, Hashable key, Binary key, NFData key) #endif => key -> Action () -- One of the following must be true: -- 1) you are the one building this key (e.g. key == topStack) -- 2) someone explicitly gave you permission with trackAllow -- 3) this file is never known to the build system, at the end it is not in the database trackChange key = do let k = newKey key Global{..} <- Action getRO Local{..} <- Action getRW liftIO $ do let top = topStack localStack if top == Just k then return () -- condition 1 else if any ($ k) localTrackAllows then return () -- condition 2 else -- condition 3 atomicModifyIORef globalTrackAbsent $ \ks -> ((fromMaybe k top, k):ks, ()) -- | Allow any matching key to violate the tracking rules. trackAllow :: #if __GLASGOW_HASKELL__ >= 704 ShakeValue key #else (Show key, Typeable key, Eq key, Hashable key, Binary key, NFData key) #endif => (key -> Bool) -> Action () trackAllow test = Action $ modifyRW $ \s -> s{localTrackAllows = f : localTrackAllows s} where -- We don't want the forall in the Haddock docs arrow1Type :: forall a b . Typeable a => (a -> b) -> TypeRep arrow1Type _ = typeOf (err "trackAllow" :: a) ty = arrow1Type test f k = typeKey k == ty && test (fromKey k) --------------------------------------------------------------------- -- RESOURCES -- | Create a finite resource, given a name (for error messages) and a quantity of the resource that exists. -- Shake will ensure that actions using the same finite resource do not execute in parallel. 
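Operationally a finite resource behaves like a counting semaphore: holding @n@ units blocks other actions once the declared quantity is exhausted. A rough standalone analogue using base's `Control.Concurrent.QSemN` (this sketch ignores Shake's pool integration and scheduling, which the real `acquireResource`/`releaseResource` handle):

```haskell
import Control.Concurrent.QSemN
import Control.Exception (bracket_)

-- Acquire n units of a finite resource for the duration of an action,
-- releasing them afterwards even if the action throws.
withUnits :: QSemN -> Int -> IO a -> IO a
withUnits sem n = bracket_ (waitQSemN sem n) (signalQSemN sem n)
```

A resource of quantity 1 (`newQSemN 1`) then serialises all actions that wrap themselves in `withUnits sem 1`, just as the Excel example below describes.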
-- As an example, only one set of calls to the Excel API can occur at one time, therefore -- Excel is a finite resource of quantity 1. You can write: -- -- @ -- 'Development.Shake.shake' 'Development.Shake.shakeOptions'{'Development.Shake.shakeThreads'=2} $ do -- 'Development.Shake.want' [\"a.xls\",\"b.xls\"] -- excel <- 'Development.Shake.newResource' \"Excel\" 1 -- \"*.xls\" 'Development.Shake.%>' \\out -> -- 'Development.Shake.withResource' excel 1 $ -- 'Development.Shake.cmd' \"excel\" out ... -- @ -- -- Now the two calls to @excel@ will not happen in parallel. -- -- As another example, calls to compilers are usually CPU bound but calls to linkers are usually -- disk bound. Running 8 linkers will often cause an 8 CPU system to grind to a halt. We can limit -- ourselves to 4 linkers with: -- -- @ -- disk <- 'Development.Shake.newResource' \"Disk\" 4 -- 'Development.Shake.want' [show i 'Development.Shake.FilePath.<.>' \"exe\" | i <- [1..100]] -- \"*.exe\" 'Development.Shake.%>' \\out -> -- 'Development.Shake.withResource' disk 1 $ -- 'Development.Shake.cmd' \"ld -o\" [out] ... -- \"*.o\" 'Development.Shake.%>' \\out -> -- 'Development.Shake.cmd' \"cl -o\" [out] ... -- @ newResource :: String -> Int -> Rules Resource newResource name mx = rulesIO $ newResourceIO name mx -- | Create a throttled resource, given a name (for error messages) and a number of resources (the 'Int') that can be -- used per time period (the 'Double' in seconds). Shake will ensure that actions using the same throttled resource -- do not exceed the limits. 
As an example, let us assume that making more than 1 request every 5 seconds to -- Google results in our client being blacklisted, we can write: -- -- @ -- google <- 'Development.Shake.newThrottle' \"Google\" 1 5 -- \"*.url\" 'Development.Shake.%>' \\out -> do -- 'Development.Shake.withResource' google 1 $ -- 'Development.Shake.cmd' \"wget\" [\"http:\/\/google.com?q=\" ++ 'Development.Shake.FilePath.takeBaseName' out] \"-O\" [out] -- @ -- -- Now we will wait at least 5 seconds after querying Google before performing another query. If Google change the rules to -- allow 12 requests per minute we can instead use @'Development.Shake.newThrottle' \"Google\" 12 60@, which would allow -- greater parallelisation, and avoid throttling entirely if only a small number of requests are necessary. -- -- In the original example we never make a fresh request until 5 seconds after the previous request has /completed/. If we instead -- want to throttle requests since the previous request /started/ we can write: -- -- @ -- google <- 'Development.Shake.newThrottle' \"Google\" 1 5 -- \"*.url\" 'Development.Shake.%>' \\out -> do -- 'Development.Shake.withResource' google 1 $ return () -- 'Development.Shake.cmd' \"wget\" [\"http:\/\/google.com?q=\" ++ 'Development.Shake.FilePath.takeBaseName' out] \"-O\" [out] -- @ -- -- However, the rule may not continue running immediately after 'Development.Shake.withResource' completes, so while -- we will never exceed an average of 1 request every 5 seconds, we may end up running an unbounded number of -- requests simultaneously. If this limitation causes a problem in practice it can be fixed. newThrottle :: String -> Int -> Double -> Rules Resource newThrottle name count period = rulesIO $ newThrottleIO name count period blockApply :: String -> Action a -> Action a blockApply msg = Action . unmodifyRW f . 
fromAction where f s0 = (s0{localBlockApply=Just msg}, \s -> s{localBlockApply=localBlockApply s0}) -- | Run an action which uses part of a finite resource. For more details see 'Resource'. -- You cannot depend on a rule (e.g. 'need') while a resource is held. withResource :: Resource -> Int -> Action a -> Action a withResource r i act = do Global{..} <- Action getRO liftIO $ globalDiagnostic $ show r ++ " waiting to acquire " ++ show i offset <- liftIO $ offsetTime Action $ captureRAW $ \continue -> acquireResource r globalPool i $ continue $ Right () res <- Action $ tryRAW $ fromAction $ blockApply ("Within withResource using " ++ show r) $ do offset <- liftIO offset liftIO $ globalDiagnostic $ show r ++ " acquired " ++ show i ++ " in " ++ showDuration offset Action $ modifyRW $ \s -> s{localDiscount = localDiscount s + offset} act liftIO $ releaseResource r globalPool i liftIO $ globalDiagnostic $ show r ++ " released " ++ show i Action $ either throwRAW return res -- | Run an action which uses part of several finite resources. Acquires the resources in a stable -- order, to prevent deadlock. If all rules requiring more than one resource acquire those -- resources with a single call to 'withResources', resources will not deadlock. withResources :: [(Resource, Int)] -> Action a -> Action a withResources res act | (r,i):_ <- filter ((< 0) . snd) res = error $ "You cannot acquire a negative quantity of " ++ show r ++ ", requested " ++ show i | otherwise = f $ groupBy ((==) `on` fst) $ sortBy (compare `on` fst) res where f [] = act f (r:rs) = withResource (fst $ head r) (sum $ map snd r) $ f rs -- | A version of 'newCache' that runs in IO, and can be called before calling 'Development.Shake.shake'. -- Most people should use 'newCache' instead. 
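The heart of `newCacheIO` below is a shared key-to-result table consulted under a lock, so each key's action runs at most once no matter how many rules ask for it. A much-simplified sketch (`memoIO` is an invented name; unlike the real version it holds the lock while computing, and does none of the `Fence`-based blocking, dependency bookkeeping, or exception propagation):

```haskell
import Control.Concurrent.MVar
import qualified Data.Map as Map

-- Memoise an IO action per key: for any given key the underlying
-- action is executed at most once, and later calls reuse the result.
memoIO :: Ord k => (k -> IO v) -> IO (k -> IO v)
memoIO act = do
    var <- newMVar Map.empty
    return $ \key -> modifyMVar var $ \mp ->
        case Map.lookup key mp of
            Just v -> return (mp, v)            -- cache hit, reuse
            Nothing -> do
                v <- act key                     -- first request: run it
                return (Map.insert key v mp, v)
```

The real implementation stores a `Fence` per key instead of a finished value, so concurrent callers of an in-flight key wait on the fence rather than blocking the whole table.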
newCacheIO :: (Eq k, Hashable k) => (k -> Action v) -> IO (k -> Action v) newCacheIO act = do var {- :: Var (Map k (Fence (Either SomeException ([Depends],v)))) -} <- newVar Map.empty return $ \key -> do join $ liftIO $ modifyVar var $ \mp -> case Map.lookup key mp of Just bar -> return $ (,) mp $ do res <- liftIO $ testFence bar res <- case res of Just res -> return res Nothing -> do pool <- Action $ getsRO globalPool Action $ captureRAW $ \k -> waitFence bar $ \v -> addPool pool $ k $ Right v case res of Left err -> Action $ throwRAW err Right (deps,v) -> do Action $ modifyRW $ \s -> s{localDepends = deps ++ localDepends s} return v Nothing -> do bar <- newFence return $ (,) (Map.insert key bar mp) $ do pre <- Action $ getsRW localDepends res <- Action $ tryRAW $ fromAction $ act key case res of Left err -> do liftIO $ signalFence bar $ Left err Action $ throwRAW err Right v -> do post <- Action $ getsRW localDepends let deps = take (length post - length pre) post liftIO $ signalFence bar $ Right (deps, v) return v -- | Given an action on a key, produce a cached version that will execute the action at most once per key. -- Using the cached result will still include any dependencies that the action requires. -- Each call to 'newCache' creates a separate cache that is independent of all other calls to 'newCache'. -- -- This function is useful when creating files that store intermediate values, -- to avoid the overhead of repeatedly reading from disk, particularly if the file requires expensive parsing. -- As an example: -- -- @ -- digits \<- 'newCache' $ \\file -> do -- src \<- readFile\' file -- return $ length $ filter isDigit src -- \"*.digits\" 'Development.Shake.%>' \\x -> do -- v1 \<- digits ('dropExtension' x) -- v2 \<- digits ('dropExtension' x) -- 'Development.Shake.writeFile'' x $ show (v1,v2) -- @ -- -- To create the result @MyFile.txt.digits@ the file @MyFile.txt@ will be read and counted, but only at most -- once per execution. 
newCache :: (Eq k, Hashable k) => (k -> Action v) -> Rules (k -> Action v) newCache = rulesIO . newCacheIO -- | Run an action without counting to the thread limit, typically used for actions that execute -- on remote machines using barely any local CPU resources. Unsafe as it allows the 'shakeThreads' limit to be exceeded. -- You cannot depend on a rule (e.g. 'need') while the extra thread is executing. -- If the rule blocks (e.g. calls 'withResource') then the extra thread may be used by some other action. -- Only really suitable for calling 'cmd'/'command'. unsafeExtraThread :: Action a -> Action a unsafeExtraThread act = Action $ do global@Global{..} <- getRO stop <- liftIO $ increasePool globalPool res <- tryRAW $ fromAction $ blockApply "Within unsafeExtraThread" act liftIO stop captureRAW $ \continue -> (if isLeft res then addPoolPriority else addPool) globalPool $ continue res -- | Ignore any dependencies added by an action. unsafeIgnoreDependencies :: Action a -> Action a unsafeIgnoreDependencies act = Action $ do pre <- getsRW localDepends res <- fromAction act modifyRW $ \s -> s{localDepends=pre} return res shake-0.15.5/src/Development/Shake/Config.hs0000644000000000000000000001044312560222036016730 0ustar0000000000000000{-# LANGUAGE DeriveDataTypeable, GeneralizedNewtypeDeriving #-} -- | A module for parsing and using config files in a Shake build system. Config files -- consist of variable bindings, for example: -- -- > # This is my Config file -- > HEADERS_DIR = /path/to/dir -- > CFLAGS = -g -I${HEADERS_DIR} -- > CFLAGS = $CFLAGS -O2 -- > include extra/file.cfg -- -- This defines the variable @HEADERS_DIR@ (equal to @\/path\/to\/dir@), and -- @CFLAGS@ (equal to @-g -I\/path\/to\/dir -O2@), and also includes the configuration -- statements in the file @extra/file.cfg@. The full lexical syntax for configuration -- files is defined here: . 
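The Config module's sample file works because `${VAR}` references are expanded against the bindings seen so far, which is how the second `CFLAGS` line extends the first. A toy expander covering just the `${NAME}` form (the real implementation is the Ninja lexer, which also handles `$NAME`, `include` directives, and escapes -- the names `expand` and `bind` are invented for this sketch):

```haskell
import qualified Data.Map as Map

-- Expand ${NAME} references against already-bound variables.
-- Unknown names expand to the empty string, and a '$' that does not
-- start a well-formed ${NAME} is passed through unchanged.
expand :: Map.Map String String -> String -> String
expand env = go
    where
        go ('$':'{':rest) | (name, '}':rest2) <- break (== '}') rest =
            Map.findWithDefault "" name env ++ go rest2
        go (c:cs) = c : go cs
        go [] = []

-- Process "NAME = value" lines in order, expanding each value against
-- the bindings accumulated so far (later bindings shadow earlier ones).
bind :: [String] -> Map.Map String String
bind = foldl step Map.empty
    where
        step env line = case break (== '=') line of
            (name, '=':val) -> Map.insert (trim name) (expand env (trim val)) env
            _ -> env
        trim = f . f where f = reverse . dropWhile (== ' ')
```

Feeding it the two sample lines yields `CFLAGS = -g -I/path/to/dir`, matching the behaviour the module comment describes.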
-- -- To use the configuration file either use 'readConfigFile' to parse the configuration file -- and use the values directly, or 'usingConfigFile' and 'getConfig' to track the configuration -- values, so they become build dependencies. module Development.Shake.Config( readConfigFile, readConfigFileWithEnv, usingConfigFile, usingConfig, getConfig, getConfigKeys ) where import Development.Shake import Development.Shake.Classes import qualified Development.Ninja.Parse as Ninja import qualified Development.Ninja.Env as Ninja import qualified Data.HashMap.Strict as Map import qualified Data.ByteString.UTF8 as UTF8 import Control.Applicative import Data.Tuple.Extra import Data.List import Prelude -- | Read a config file, returning a list of the variables and their bindings. -- Config files use the Ninja lexical syntax: -- readConfigFile :: FilePath -> IO (Map.HashMap String String) readConfigFile = readConfigFileWithEnv [] -- | Read a config file with an initial environment, returning a list of the variables and their bindings. -- Config files use the Ninja lexical syntax: -- readConfigFileWithEnv :: [(String, String)] -> FilePath -> IO (Map.HashMap String String) readConfigFileWithEnv vars file = do env <- Ninja.newEnv mapM_ (uncurry (Ninja.addEnv env) . (UTF8.fromString *** UTF8.fromString)) vars Ninja.parse file env mp <- Ninja.fromEnv env return $ Map.fromList $ map (UTF8.toString *** UTF8.toString) $ Map.toList mp newtype Config = Config String deriving (Show,Typeable,Eq,Hashable,Binary,NFData) newtype ConfigKeys = ConfigKeys () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) -- | Specify the file to use with 'getConfig'. usingConfigFile :: FilePath -> Rules () usingConfigFile file = do mp <- newCache $ \() -> do need [file] liftIO $ readConfigFile file addOracle $ \(Config x) -> Map.lookup x <$> mp () addOracle $ \(ConfigKeys x) -> sort . 
Map.keys <$> mp () return () -- | Specify the values to use with 'getConfig', generally prefer -- 'usingConfigFile' unless you also need access to the values -- of variables outside 'Action'. usingConfig :: Map.HashMap String String -> Rules () usingConfig mp = do addOracle $ \(Config x) -> return $ Map.lookup x mp return () -- | Obtain the value of a configuration variable, returns 'Nothing' to indicate the variable -- has no binding. Any build system using 'getConfig' /must/ call either 'usingConfigFile' -- or 'usingConfig'. The 'getConfig' function will introduce a dependency on the configuration -- variable (but not the whole configuration file), and if the configuration variable changes, the rule will be rerun. -- As an example: -- -- @ -- 'usingConfigFile' \"myconfiguration.cfg\" -- \"*.o\" '%>' \\out -> do -- cflags <- 'getConfig' \"CFLAGS\" -- 'cmd' \"gcc\" [out '-<.>' \"c\"] (fromMaybe \"\" cflags) -- @ getConfig :: String -> Action (Maybe String) getConfig = askOracle . Config -- | Obtain the configuration keys. -- Any build system using 'getConfigKeys' /must/ call either 'usingConfigFile' or 'usingConfig'. -- The 'getConfigKeys' function will introduce a dependency on the configuration keys -- (but not the whole configuration file), and if the configuration keys change, the rule will be rerun. -- Usually use as part of an action. -- As an example: -- -- @ -- 'usingConfigFile' \"myconfiguration.cfg\" -- 'action' $ need =<< getConfigKeys -- @ getConfigKeys :: Action [String] getConfigKeys = askOracle $ ConfigKeys () shake-0.15.5/src/Development/Shake/Command.hs0000644000000000000000000006722012560222036017106 0ustar0000000000000000{-# LANGUAGE FlexibleInstances, TypeSynonymInstances, TypeOperators, ScopedTypeVariables #-} -- | This module provides functions for calling command line programs, primarily -- 'command' and 'cmd'. 
As a simple example: -- -- @ -- 'command' [] \"gcc\" [\"-c\",myfile] -- @ -- -- The functions from this module are now available directly from "Development.Shake". -- You should only need to import this module if you are using the 'cmd' function in the 'IO' monad. module Development.Shake.Command( command, command_, cmd, unit, CmdArguments, Stdout(..), Stderr(..), Stdouterr(..), Exit(..), Process(..), CmdTime(..), CmdLine(..), CmdResult, CmdString, CmdOption(..), addPath, addEnv, ) where import Data.Tuple.Extra import Control.Applicative import Control.Exception.Extra import Control.Monad.Extra import Control.Monad.IO.Class import Data.Either.Extra import Data.List.Extra import Data.Maybe import System.Directory import System.Environment.Extra import System.Exit import System.IO.Extra import System.Process import System.Info.Extra import System.Time.Extra import System.IO.Unsafe(unsafeInterleaveIO) import qualified Data.ByteString as BS import qualified Data.ByteString.Lazy.Char8 as LBS import General.Process import Prelude import Development.Shake.Core import Development.Shake.FilePath import Development.Shake.Types import Development.Shake.Rules.File --------------------------------------------------------------------- -- ACTUAL EXECUTION -- | Options passed to 'command' or 'cmd' to control how processes are executed. data CmdOption = Cwd FilePath -- ^ Change the current directory in the spawned process. By default uses this processes current directory. | Env [(String,String)] -- ^ Change the environment variables in the spawned process. By default uses this processes environment. | AddEnv String String -- ^ Add an environment variable in the child process. | AddPath [String] [String] -- ^ Add some items to the prefix and suffix of the @$PATH@ variable. | Stdin String -- ^ Given as the @stdin@ of the spawned process. By default the @stdin@ is inherited. | StdinBS LBS.ByteString -- ^ Given as the @stdin@ of the spawned process. 
| Shell -- ^ Pass the command to the shell without escaping - any arguments will be joined with spaces. By default arguments are escaped properly. | BinaryPipes -- ^ Treat the @stdin@\/@stdout@\/@stderr@ messages as binary. By default 'String' results use text encoding and 'ByteString' results use binary encoding. | Traced String -- ^ Name to use with 'traced', or @\"\"@ for no tracing. By default traces using the name of the executable. | Timeout Double -- ^ Abort the computation after N seconds, will raise a failure exit code. Calls 'interruptProcessGroupOf' and 'terminateProcess', but may sometimes fail to abort the process and not timeout. | WithStdout Bool -- ^ Should I include the @stdout@ in the exception if the command fails? Defaults to 'False'. | WithStderr Bool -- ^ Should I include the @stderr@ in the exception if the command fails? Defaults to 'True'. | EchoStdout Bool -- ^ Should I echo the @stdout@? Defaults to 'True' unless a 'Stdout' result is required or you use 'FileStdout'. | EchoStderr Bool -- ^ Should I echo the @stderr@? Defaults to 'True' unless a 'Stderr' result is required or you use 'FileStderr'. | FileStdout FilePath -- ^ Should I put the @stdout@ to a file. | FileStderr FilePath -- ^ Should I put the @stderr@ to a file. deriving (Eq,Ord,Show) -- | /Deprecated:/ Use 'AddPath'. This function will be removed in a future version. -- -- Add a prefix and suffix to the @$PATH@ environment variable. For example: -- -- @ -- opt <- 'addPath' [\"\/usr\/special\"] [] -- 'cmd' opt \"userbinary --version\" -- @ -- -- Would prepend @\/usr\/special@ to the current @$PATH@, and the command would pick -- @\/usr\/special\/userbinary@, if it exists. To add other variables see 'addEnv'. addPath :: MonadIO m => [String] -> [String] -> m CmdOption addPath pre post = do args <- liftIO getEnvironment let (path,other) = partition ((== "PATH") . (if isWindows then upper else id) . 
fst) args return $ Env $ [("PATH",intercalate [searchPathSeparator] $ pre ++ post) | null path] ++ [(a,intercalate [searchPathSeparator] $ pre ++ [b | b /= ""] ++ post) | (a,b) <- path] ++ other -- | /Deprecated:/ Use 'AddEnv'. This function will be removed in a future version. -- -- Add a single variable to the environment. For example: -- -- @ -- opt <- 'addEnv' [(\"CFLAGS\",\"-O2\")] -- 'cmd' opt \"gcc -c main.c\" -- @ -- -- Would add the environment variable @$CFLAGS@ with value @-O2@. If the variable @$CFLAGS@ -- was already defined it would be overwritten. If you wish to modify @$PATH@ see 'addPath'. addEnv :: MonadIO m => [(String, String)] -> m CmdOption addEnv extra = do args <- liftIO getEnvironment return $ Env $ extra ++ filter (\(a,b) -> a `notElem` map fst extra) args data Str = Str String | BS BS.ByteString | LBS LBS.ByteString | Unit deriving Eq data Result = ResultStdout Str | ResultStderr Str | ResultStdouterr Str | ResultCode ExitCode | ResultTime Double | ResultLine String | ResultProcess Pid deriving Eq data Pid = Pid0 | Pid ProcessHandle instance Eq Pid where _ == _ = True --------------------------------------------------------------------- -- ACTION EXPLICIT OPERATION commandExplicit :: String -> [CmdOption] -> [Result] -> String -> [String] -> Action [Result] commandExplicit funcName copts results exe args = do opts <- getShakeOptions verb <- getVerbosity let skipper act = if null results && not (shakeRunCommands opts) then return [] else act let verboser act = do let cwd = listToMaybe $ reverse [x | Cwd x <- copts] putLoud $ maybe "" (\x -> "cd " ++ x ++ "; ") cwd ++ saneCommandForUser exe args (if verb >= Loud then quietly else id) act let tracer = case reverse [x | Traced x <- copts] of "":_ -> liftIO msg:_ -> traced msg [] -> traced (takeFileName exe) let tracker act = case shakeLint opts of Just LintTracker -> (if isWindows then winTracker else unixTracker) act _ -> act [] exe args winTracker act = do (dir, cleanup) <- liftIO 
newTempDir flip actionFinally cleanup $ do res <- act [] "tracker" $ "/if":dir:"/c":exe:args (rs, ws) <- liftIO $ trackerFiles dir trackRead rs trackWrite ws return res unixTracker act = do (file, cleanup) <- liftIO newTempFile flip actionFinally cleanup $ do fsat <- liftIO $ getEnv "FSAT" let vars = [AddEnv "DYLD_INSERT_LIBRARIES" fsat ,AddEnv "DYLD_FORCE_FLAT_NAMESPACE" "1" ,AddEnv "FSAT_OUT" file] res <- act vars exe args (rs, ws) <- liftIO $ fsatraceFiles file whitelist <- liftIO unixWhitelist let whitelisted x = any (\w -> (w ++ "/") `isPrefixOf` x) whitelist trackRead $ filter (not . whitelisted) rs trackWrite $ filter (not . whitelisted) ws return res skipper $ tracker $ \opts exe args -> verboser $ tracer $ commandExplicitIO funcName (opts++copts) results exe args -- | Given a directory (as passed to tracker /if) report on which files were used for reading/writing trackerFiles :: FilePath -> IO ([FilePath], [FilePath]) trackerFiles dir = do curdir <- getCurrentDirectory let pre = upper curdir ++ "\\" files <- getDirectoryContents dir let f typ = do files <- forM [x | x <- files, takeExtension x == ".tlog", takeExtension (dropExtension $ dropExtension x) == '.':typ] $ \file -> do xs <- readFileEncoding utf16 $ dir </> file return $ filter (not . isPrefixOf "." . takeFileName) . mapMaybe (stripPrefix pre) $ lines xs fmap nubOrd $ mapMaybeM correctCase $ nubOrd $ concat files liftM2 (,) (f "read") (f "write") correctCase :: FilePath -> IO (Maybe FilePath) correctCase x = f "" x where f pre "" = return $ Just pre f pre x = do let (a,b) = (takeDirectory1 x, dropDirectory1 x) dir <- getDirectoryContents pre case find ((==) a .
upper) dir of Nothing -> return Nothing -- if it can't be found it probably doesn't exist, so assume a file that wasn't really read Just v -> f (pre +/+ v) b a +/+ b = if null a then b else a ++ "/" ++ b fsatraceFiles :: FilePath -> IO ([FilePath], [FilePath]) fsatraceFiles file = do xs <- parseFSAT <$> readFileUTF8 file let reader (FSATRead x) = Just x; reader _ = Nothing writer (FSATWrite x) = Just x; writer (FSATMove x y) = Just x; writer _ = Nothing frs <- liftIO $ filterM doesFileExist $ nubOrd $ map normalise $ mapMaybe reader xs fws <- liftIO $ filterM doesFileExist $ nubOrd $ map normalise $ mapMaybe writer xs return (frs, fws) data FSAT = FSATWrite FilePath | FSATRead FilePath | FSATMove FilePath FilePath | FSATDelete FilePath parseFSAT :: String -> [FSAT] -- any parse errors are skipped parseFSAT = mapMaybe (f . wordsBy (== ':')) . lines where f ["w",x] = Just $ FSATWrite x f ["r",x] = Just $ FSATRead x f ["m",x,y] = Just $ FSATMove x y f ["d",x] = Just $ FSATDelete x f _ = Nothing unixWhitelist :: IO [FilePath] unixWhitelist = do home <- getEnv "HOME" return [home ++ "/.ghc" ,home ++ "/Library/Haskell" ,home ++ "/Applications" ,home ++ "/.cabal" ,"/Applications" ,"/var/folders" ,"/usr" ,"/Library" ,"/System" ] --------------------------------------------------------------------- -- IO EXPLICIT OPERATION commandExplicitIO :: String -> [CmdOption] -> [Result] -> String -> [String] -> IO [Result] commandExplicitIO funcName opts results exe args = do let (grabStdout, grabStderr) = both or $ unzip $ for results $ \r -> case r of ResultStdout{} -> (True, False) ResultStderr{} -> (False, True) ResultStdouterr{} -> (True, True) _ -> (False, False) optEnv <- resolveEnv opts let optCwd = let x = last $ "" : [x | Cwd x <- opts] in if x == "" then Nothing else Just x let optStdin = flip mapMaybe opts $ \x -> case x of Stdin x -> Just $ Left x; StdinBS x -> Just $ Right x; _ -> Nothing let optShell = Shell `elem` opts let optBinary = BinaryPipes `elem` opts let 
optAsync = ResultProcess Pid0 `elem` results let optTimeout = listToMaybe $ reverse [x | Timeout x <- opts] let optWithStdout = last $ False : [x | WithStdout x <- opts] let optWithStderr = last $ True : [x | WithStderr x <- opts] let optFileStdout = [x | FileStdout x <- opts] let optFileStderr = [x | FileStderr x <- opts] let optEchoStdout = last $ (not grabStdout && null optFileStdout) : [x | EchoStdout x <- opts] let optEchoStderr = last $ (not grabStderr && null optFileStderr) : [x | EchoStderr x <- opts] let cmdline = saneCommandForUser exe args let bufLBS f = do (a,b) <- buf $ LBS LBS.empty; return (a, (\(LBS x) -> f x) <$> b) buf Str{} | optBinary = bufLBS (Str . LBS.unpack) buf Str{} = do x <- newBuffer; return ([DestString x | not optAsync], Str . concat <$> readBuffer x) buf LBS{} = do x <- newBuffer; return ([DestBytes x | not optAsync], LBS . LBS.fromChunks <$> readBuffer x) buf BS {} = bufLBS (BS . BS.concat . LBS.toChunks) buf Unit = return ([], return Unit) (dStdout, dStderr, resultBuild) :: ([[Destination]], [[Destination]], [Double -> ProcessHandle -> ExitCode -> IO Result]) <- fmap unzip3 $ forM results $ \r -> case r of ResultCode _ -> return ([], [], \dur pid ex -> return $ ResultCode ex) ResultTime _ -> return ([], [], \dur pid ex -> return $ ResultTime dur) ResultLine _ -> return ([], [], \dur pid ex -> return $ ResultLine cmdline) ResultProcess _ -> return ([], [], \dur pid ex -> return $ ResultProcess $ Pid pid) ResultStdout s -> do (a,b) <- buf s; return (a , [], \_ _ _ -> fmap ResultStdout b) ResultStderr s -> do (a,b) <- buf s; return ([], a , \_ _ _ -> fmap ResultStderr b) ResultStdouterr s -> do (a,b) <- buf s; return (a , a , \_ _ _ -> fmap ResultStdouterr b) exceptionBuffer <- newBuffer po <- resolvePath $ ProcessOpts {poCommand = if optShell then ShellCommand $ unwords $ exe:args else RawCommand exe args ,poCwd = optCwd, poEnv = optEnv, poTimeout = optTimeout ,poStdin = if optBinary || any isRight optStdin then Right $ LBS.concat $ 
map (either LBS.pack id) optStdin else Left $ concatMap fromLeft optStdin ,poStdout = [DestEcho | optEchoStdout] ++ map DestFile optFileStdout ++ [DestString exceptionBuffer | optWithStdout && not optAsync] ++ concat dStdout ,poStderr = [DestEcho | optEchoStderr] ++ map DestFile optFileStderr ++ [DestString exceptionBuffer | optWithStderr && not optAsync] ++ concat dStderr ,poAsync = optAsync } res <- try_ $ duration $ process po let failure extra = do cwd <- case optCwd of Nothing -> return "" Just v -> do v <- canonicalizePath v `catch_` const (return v) return $ "Current directory: " ++ v ++ "\n" fail $ "Development.Shake." ++ funcName ++ ", system command failed\n" ++ "Command: " ++ cmdline ++ "\n" ++ cwd ++ extra case res of Left err -> failure $ show err Right (dur,(pid,ex)) | ex /= ExitSuccess && ResultCode ExitSuccess `notElem` results -> do exceptionBuffer <- readBuffer exceptionBuffer let captured = ["Stderr" | optWithStderr] ++ ["Stdout" | optWithStdout] failure $ "Exit code: " ++ show (case ex of ExitFailure i -> i; _ -> 0) ++ "\n" ++ if null captured then "Stderr not captured because WithStderr False was used\n" else if null exceptionBuffer then intercalate " and " captured ++ " " ++ (if length captured == 1 then "was" else "were") ++ " empty" else intercalate " and " captured ++ ":\n" ++ unlines (dropWhile null $ lines $ concat exceptionBuffer) Right (dur,(pid,ex)) -> mapM (\f -> f dur pid ex) resultBuild resolveEnv :: [CmdOption] -> IO (Maybe [(String, String)]) resolveEnv opts | null env, null addEnv, null addPath = return Nothing | otherwise = Just . unique . tweakPath . 
(++ addEnv) <$> if null env then getEnvironment else return (concat env) where env = [x | Env x <- opts] addEnv = [(x,y) | AddEnv x y <- opts] addPath = [(x,y) | AddPath x y <- opts] newPath mid = intercalate [searchPathSeparator] $ concat (reverse $ map fst addPath) ++ [mid | mid /= ""] ++ concatMap snd addPath isPath x = (if isWindows then upper else id) x == "PATH" tweakPath xs | not $ any (isPath . fst) xs = ("PATH", newPath "") : xs | otherwise = map (\(a,b) -> (a, if isPath a then newPath b else b)) xs unique = reverse . nubOrdOn (if isWindows then upper . fst else fst) . reverse -- | If the user specifies a custom $PATH, and not Shell, then try and resolve their exe ourselves. -- Tricky, because on Windows it doesn't look in the $PATH first. resolvePath :: ProcessOpts -> IO ProcessOpts resolvePath po | Just e <- poEnv po , Just (_, path) <- find ((==) "PATH" . (if isWindows then upper else id) . fst) e , RawCommand prog args <- poCommand po = do let progExe = if prog == prog -<.> exe then prog else prog <.> exe -- use unsafeInterleaveIO to allow laziness to skip the queries we don't use pathOld <- unsafeInterleaveIO $ fmap (fromMaybe "") $ lookupEnv "PATH" old <- unsafeInterleaveIO $ findExecutable prog new <- unsafeInterleaveIO $ findExecutableWith (splitSearchPath path) progExe old2 <- unsafeInterleaveIO $ findExecutableWith (splitSearchPath pathOld) progExe switch <- return $ case () of _ | path == pathOld -> False -- The state I can see hasn't changed | Nothing <- new -> False -- I have nothing to offer | Nothing <- old -> True -- I failed last time, so this must be an improvement | Just old <- old, Just new <- new, equalFilePath old new -> False -- no different | Just old <- old, Just old2 <- old2, equalFilePath old old2 -> True -- I could predict last time | otherwise -> False return $ case new of Just new | switch -> po{poCommand = RawCommand new args} _ -> po resolvePath po = return po findExecutableWith :: [FilePath] -> String -> IO (Maybe FilePath) 
findExecutableWith path x = flip firstJustM (map (</> x) path) $ \s -> ifM (doesFileExist s) (return $ Just s) (return Nothing) -- Like System.Process, but tweaked to show less escaping, -- Relies on relatively detailed internals of showCommandForUser. saneCommandForUser :: FilePath -> [String] -> String saneCommandForUser cmd args = unwords $ map f $ cmd:args where f x = if take (length y - 2) (drop 1 y) == x then x else y where y = showCommandForUser x [] --------------------------------------------------------------------- -- FIXED ARGUMENT WRAPPER -- | Collect the @stdout@ of the process. -- If used, the @stdout@ will not be echoed to the terminal, unless you include 'EchoStdout'. -- The value type may be either 'String', or either lazy or strict 'ByteString'. newtype Stdout a = Stdout {fromStdout :: a} -- | Collect the @stderr@ of the process. -- If used, the @stderr@ will not be echoed to the terminal, unless you include 'EchoStderr'. -- The value type may be either 'String', or either lazy or strict 'ByteString'. newtype Stderr a = Stderr {fromStderr :: a} -- | Collect the @stdout@ and @stderr@ of the process. -- If used, the @stderr@ and @stdout@ will not be echoed to the terminal, unless you include 'EchoStdout' and 'EchoStderr'. -- The value type may be either 'String', or either lazy or strict 'ByteString'. newtype Stdouterr a = Stdouterr {fromStdouterr :: a} -- | Collect the 'ExitCode' of the process. -- If you do not collect the exit code, any 'ExitFailure' will cause an exception. newtype Exit = Exit {fromExit :: ExitCode} -- | Collect the 'ProcessHandle' of the process. -- If you do collect the process handle, the command will run asynchronously and the call to 'cmd'/'command' -- will return as soon as the process is spawned. Any 'Stdout'\/'Stderr' captures will return empty strings. newtype Process = Process {fromProcess :: ProcessHandle} -- | Collect the time taken to execute the process.
Can be used in conjunction with 'CmdLine' to -- write helper functions that print out the time of a result. -- -- @ --timer :: ('CmdResult' r, MonadIO m) => (forall r . 'CmdResult' r => m r) -> m r --timer act = do -- ('CmdTime' t, 'CmdLine' x, r) <- act -- liftIO $ putStrLn $ \"Command \" ++ x ++ \" took \" ++ show t ++ \" seconds\" -- return r -- --run :: IO () --run = timer $ 'cmd' \"ghc --version\" -- @ newtype CmdTime = CmdTime {fromCmdTime :: Double} -- | Collect the command line used for the process. This command line will be approximate - -- suitable for user diagnostics, but not for direct execution. newtype CmdLine = CmdLine {fromCmdLine :: String} class CmdString a where cmdString :: (Str, Str -> a) instance CmdString () where cmdString = (Unit, \Unit -> ()) instance CmdString String where cmdString = (Str "", \(Str x) -> x) instance CmdString BS.ByteString where cmdString = (BS BS.empty, \(BS x) -> x) instance CmdString LBS.ByteString where cmdString = (LBS LBS.empty, \(LBS x) -> x) -- | A class for specifying what results you want to collect from a process. -- Values are formed of 'Stdout', 'Stderr', 'Exit' and tuples of those. 
class CmdResult a where -- Return a list of results (with the right type but dummy data) -- and a function to transform a populated set of results into a value cmdResult :: ([Result], [Result] -> a) instance CmdResult Exit where cmdResult = ([ResultCode ExitSuccess], \[ResultCode x] -> Exit x) instance CmdResult ExitCode where cmdResult = ([ResultCode ExitSuccess], \[ResultCode x] -> x) instance CmdResult Process where cmdResult = ([ResultProcess Pid0], \[ResultProcess (Pid x)] -> Process x) instance CmdResult ProcessHandle where cmdResult = ([ResultProcess Pid0], \[ResultProcess (Pid x)] -> x) instance CmdResult CmdLine where cmdResult = ([ResultLine ""], \[ResultLine x] -> CmdLine x) instance CmdResult CmdTime where cmdResult = ([ResultTime 0], \[ResultTime x] -> CmdTime x) instance CmdString a => CmdResult (Stdout a) where cmdResult = let (a,b) = cmdString in ([ResultStdout a], \[ResultStdout x] -> Stdout $ b x) instance CmdString a => CmdResult (Stderr a) where cmdResult = let (a,b) = cmdString in ([ResultStderr a], \[ResultStderr x] -> Stderr $ b x) instance CmdString a => CmdResult (Stdouterr a) where cmdResult = let (a,b) = cmdString in ([ResultStdouterr a], \[ResultStdouterr x] -> Stdouterr $ b x) instance CmdResult () where cmdResult = ([], \[] -> ()) instance (CmdResult x1, CmdResult x2) => CmdResult (x1,x2) where cmdResult = (a1++a2, \rs -> let (r1,r2) = splitAt (length a1) rs in (b1 r1, b2 r2)) where (a1,b1) = cmdResult (a2,b2) = cmdResult cmdResultWith f = second (f .) 
cmdResult instance (CmdResult x1, CmdResult x2, CmdResult x3) => CmdResult (x1,x2,x3) where cmdResult = cmdResultWith $ \(a,(b,c)) -> (a,b,c) instance (CmdResult x1, CmdResult x2, CmdResult x3, CmdResult x4) => CmdResult (x1,x2,x3,x4) where cmdResult = cmdResultWith $ \(a,(b,c,d)) -> (a,b,c,d) instance (CmdResult x1, CmdResult x2, CmdResult x3, CmdResult x4, CmdResult x5) => CmdResult (x1,x2,x3,x4,x5) where cmdResult = cmdResultWith $ \(a,(b,c,d,e)) -> (a,b,c,d,e) -- | Execute a system command. Before running 'command' make sure you 'Development.Shake.need' any files -- that are used by the command. -- -- This function takes a list of options (often just @[]@, see 'CmdOption' for the available -- options), the name of the executable (either a full name, or a program on the @$PATH@) and -- a list of arguments. The result is often @()@, but can be a tuple containing any of 'Stdout', -- 'Stderr' and 'Exit'. Some examples: -- -- @ -- 'command_' [] \"gcc\" [\"-c\",\"myfile.c\"] -- compile a file, throwing an exception on failure -- 'Exit' c <- 'command' [] \"gcc\" [\"-c\",myfile] -- run a command, recording the exit code -- ('Exit' c, 'Stderr' err) <- 'command' [] \"gcc\" [\"-c\",\"myfile.c\"] -- run a command, recording the exit code and error output -- 'Stdout' out <- 'command' [] \"gcc\" [\"-MM\",\"myfile.c\"] -- run a command, recording the output -- 'command_' ['Cwd' \"generated\"] \"gcc\" [\"-c\",myfile] -- run a command in a directory -- @ -- -- Unless you retrieve the 'ExitCode' using 'Exit', any 'ExitFailure' will throw an error, including -- the 'Stderr' in the exception message. If you capture the 'Stdout' or 'Stderr', that stream will not be echoed to the console, -- unless you use the option 'EchoStdout' or 'EchoStderr'. -- -- If you use 'command' inside a @do@ block and do not use the result, you may get a compile-time error about being -- unable to deduce 'CmdResult'. To avoid this error, use 'command_'.
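The 'CmdResult' instances above share one trick: each collector declares a list of dummy 'Result' values (saying which slots it needs) plus a decoder, and the pair instance concatenates the dummies and uses 'splitAt' to hand each decoder its own slice of the populated list. As a minimal standalone sketch of that mechanism (the types 'Res' and 'Spec' here are simplified stand-ins, not the real shake definitions):

```haskell
-- Simplified model of the CmdResult dummy-results/decoder pattern.
-- 'Spec a' pairs the dummy results a runner must populate with a
-- function that turns the populated results back into an 'a'.
data Res = RCode Int | RTime Double deriving (Eq, Show)

type Spec a = ([Res], [Res] -> a)

specCode :: Spec Int
specCode = ([RCode 0], \[RCode x] -> x)

specTime :: Spec Double
specTime = ([RTime 0], \[RTime x] -> x)

-- Mirrors the tuple instance: concatenate the dummy lists, then split
-- the populated list at the length of the first spec's dummies.
specPair :: Spec a -> Spec b -> Spec (a, b)
specPair (a1, b1) (a2, b2) =
    (a1 ++ a2, \rs -> let (r1, r2) = splitAt (length a1) rs in (b1 r1, b2 r2))

main :: IO ()
main = do
    let (dummies, decode) = specPair specCode specTime
    print dummies                        -- the slots a runner must fill in
    print (decode [RCode 7, RTime 1.5])  -- (7,1.5)
```

This is why a request like @('Exit' c, 'CmdTime' t) <- ...@ works for any tuple shape: the larger instances just re-nest pairs via @cmdResultWith@.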
-- -- By default the @stderr@ stream will be captured for use in error messages, and also echoed. To only echo -- pass @'WithStderr' 'False'@, which causes no streams to be captured by Shake, and certain programs (e.g. @gcc@) -- to detect they are running in a terminal. command :: CmdResult r => [CmdOption] -> String -> [String] -> Action r command opts x xs = fmap b $ commandExplicit "command" opts a x xs where (a,b) = cmdResult -- | A version of 'command' where you do not require any results, used to avoid errors about being unable -- to deduce 'CmdResult'. command_ :: [CmdOption] -> String -> [String] -> Action () command_ opts x xs = void $ commandExplicit "command_" opts [] x xs --------------------------------------------------------------------- -- VARIABLE ARGUMENT WRAPPER type a :-> t = a -- | Execute a system command. Before running 'cmd' make sure you 'Development.Shake.need' any files -- that are used by the command. -- -- * @String@ arguments are treated as whitespace separated arguments. -- -- * @[String]@ arguments are treated as literal arguments. -- -- * 'CmdOption' arguments are used as options. -- -- As some examples, here are some calls, and the resulting command string: -- -- @ -- 'unit' $ 'cmd' \"git log --pretty=\" \"oneline\" -- git log --pretty= oneline -- 'unit' $ 'cmd' \"git log --pretty=\" [\"oneline\"] -- git log --pretty= oneline -- 'unit' $ 'cmd' \"git log\" (\"--pretty=\" ++ \"oneline\") -- git log --pretty=oneline -- 'unit' $ 'cmd' \"git log\" (\"--pretty=\" ++ \"one line\") -- git log --pretty=one line -- 'unit' $ 'cmd' \"git log\" [\"--pretty=\" ++ \"one line\"] -- git log "--pretty=one line" -- @ -- -- More examples, including return values, see this translation of the examples given for the 'command' function: -- -- @ -- () <- 'cmd' \"gcc -c myfile.c\" -- compile a file, throwing an exception on failure -- 'unit' $ 'cmd' \"gcc -c myfile.c\" -- alternative to () <- binding. 
-- 'Exit' c <- 'cmd' \"gcc -c\" [myfile] -- run a command, recording the exit code -- ('Exit' c, 'Stderr' err) <- 'cmd' \"gcc -c myfile.c\" -- run a command, recording the exit code and error output -- 'Stdout' out <- 'cmd' \"gcc -MM myfile.c\" -- run a command, recording the output -- 'cmd' ('Cwd' \"generated\") \"gcc -c\" [myfile] :: 'Action' () -- run a command in a directory -- @ -- -- When passing file arguments we use @[myfile]@ so that if the @myfile@ variable contains spaces they are properly escaped. -- -- If you use 'cmd' inside a @do@ block and do not use the result, you may get a compile-time error about being -- unable to deduce 'CmdResult'. To avoid this error, bind the result to @()@, or include a type signature, or use -- the 'unit' function. -- -- The 'cmd' function can also be run in the 'IO' monad, but then 'Traced' is ignored and command lines are not echoed. -- As an example: -- -- @ -- 'cmd' ('Cwd' \"generated\") 'Shell' \"gcc -c myfile.c\" :: IO () -- @ cmd :: CmdArguments args => args :-> Action r cmd = cmdArguments [] class CmdArguments t where cmdArguments :: [Either CmdOption String] -> t instance (Arg a, CmdArguments r) => CmdArguments (a -> r) where cmdArguments xs x = cmdArguments $ xs ++ arg x instance CmdResult r => CmdArguments (Action r) where cmdArguments x = case partitionEithers x of (opts, x:xs) -> let (a,b) = cmdResult in fmap b $ commandExplicit "cmd" opts a x xs _ -> error "Error, no executable or arguments given to Development.Shake.cmd" instance CmdResult r => CmdArguments (IO r) where cmdArguments x = case partitionEithers x of (opts, x:xs) -> let (a,b) = cmdResult in fmap b $ commandExplicitIO "cmd" opts a x xs _ -> error "Error, no executable or arguments given to Development.Shake.cmd" class Arg a where arg :: a -> [Either CmdOption String] instance Arg String where arg = map Right . words instance Arg [String] where arg = map Right instance Arg CmdOption where arg = return . 
Left instance Arg [CmdOption] where arg = map Left instance Arg a => Arg (Maybe a) where arg = maybe [] arg shake-0.15.5/src/Development/Shake/Classes.hs0000644000000000000000000000064312560222036017121 0ustar0000000000000000 -- | This module reexports the six necessary type classes that every 'Rule' type must support. -- You can use this module to define new rules without depending on the @binary@, @deepseq@ and @hashable@ packages. module Development.Shake.Classes( Show(..), Typeable(..), Eq(..), Hashable(..), Binary(..), NFData(..) ) where import Data.Hashable import Data.Typeable import Data.Binary import Control.DeepSeq shake-0.15.5/src/Development/Shake/ByteString.hs0000644000000000000000000000513012560222036017612 0ustar0000000000000000 module Development.Shake.ByteString(parseMakefile, filepathNormalise, linesCR) where import qualified Data.ByteString.Char8 as BS import qualified System.FilePath as Native import System.Info.Extra import Data.Char import Data.List endsSlash :: BS.ByteString -> Bool endsSlash = BS.isSuffixOf (BS.singleton '\\') wordsMakefile :: BS.ByteString -> [BS.ByteString] wordsMakefile = f . BS.splitWith isSpace where f (x:xs) | BS.null x = f xs f (x:y:xs) | endsSlash x = f $ BS.concat [BS.init x, BS.singleton ' ', y] : xs f (x:xs) = x : f xs f [] = [] parseMakefile :: BS.ByteString -> [(BS.ByteString, [BS.ByteString])] parseMakefile = concatMap f . join . 
linesCR where join xs = case span endsSlash xs of ([], []) -> [] (xs, []) -> [BS.unwords $ map BS.init xs] ([], y:ys) -> y : join ys (xs, y:ys) -> BS.unwords (map BS.init xs ++ [y]) : join ys f x = [(a, wordsMakefile $ BS.drop 1 b) | a <- wordsMakefile a] where (a,b) = BS.break (== ':') $ BS.takeWhile (/= '#') x -- | This is a hot-spot, so optimised linesCR :: BS.ByteString -> [BS.ByteString] linesCR x = case BS.split '\n' x of x:xs | Just ('\r',x) <- unsnoc x -> x : map (\x -> case unsnoc x of Just ('\r',x) -> x; _ -> x) xs xs -> xs where -- the ByteString unsnoc was introduced in a newer version unsnoc x | BS.null x = Nothing | otherwise = Just (BS.last x, BS.init x) filepathNormalise :: BS.ByteString -> BS.ByteString filepathNormalise xs | isWindows, Just (a,xs) <- BS.uncons xs, sep a, Just (b,_) <- BS.uncons xs, sep b = '/' `BS.cons` f xs | otherwise = f xs where sep = Native.isPathSeparator f o = deslash o $ BS.concat $ (slash:) $ intersperse slash $ reverse $ (BS.empty:) $ g 0 $ reverse $ split $ o deslash o x | x == slash = case (pre,pos) of (True,True) -> slash (True,False) -> BS.pack "/." (False,True) -> BS.pack "./" (False,False) -> dot | otherwise = (if pre then id else BS.tail) $ (if pos then id else BS.init) x where pre = not (BS.null o) && sep (BS.head o) pos = not (BS.null o) && sep (BS.last o) g i [] = replicate i dotDot g i (x:xs) | BS.null x = g i xs g i (x:xs) | x == dotDot = g (i+1) xs g i (x:xs) | x == dot = g i xs g 0 (x:xs) = x : g 0 xs g i (x:xs) = g (i-1) xs split xs = BS.splitWith sep xs dotDot = BS.pack ".." dot = BS.singleton '.' slash = BS.singleton '/' shake-0.15.5/src/Development/Shake/Args.hs0000644000000000000000000004432312560222036016423 0ustar0000000000000000 -- | Command line parsing flags. 
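The 'linesCR' splitter above works on strict 'ByteString's and, as an optimisation, only strips @'\r'@ from the remaining lines when the first line ends in one. As an illustrative sketch, here is a plain-'String' analogue that strips the trailing @'\r'@ unconditionally (the name @linesCR'@ is hypothetical, and unlike @BS.split@ it returns @[\"\"]@ rather than @[]@ on empty input):

```haskell
-- Plain-String analogue of linesCR: split on '\n' and drop a single
-- trailing '\r' from each segment, so CRLF input parses like LF input.
-- Like BS.split (and unlike Prelude lines), a trailing '\n' yields a
-- final empty segment.
linesCR' :: String -> [String]
linesCR' = map dropCR . splitOn '\n'
  where
    dropCR s | not (null s), last s == '\r' = init s
             | otherwise                    = s
    splitOn c s = case break (== c) s of
        (a, _:rest) -> a : splitOn c rest
        (a, [])     -> [a]
```

For example, @linesCR' \"a\\r\\nb\\r\\n\"@ gives @[\"a\",\"b\",\"\"]@, the same segments the ByteString version produces.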
module Development.Shake.Args(shakeOptDescrs, shakeArgs, shakeArgsWith) where import Paths_shake import Development.Shake.Types import Development.Shake.Core import Development.Shake.Demo import Development.Shake.Rules.File import Development.Shake.Progress import Development.Shake.Shake import General.Timing import Data.Tuple.Extra import Control.Concurrent import Control.Exception.Extra import Control.Monad import Data.Char import Data.Either import Data.List import Data.Maybe import Data.Time import Data.Version(showVersion) import System.Console.GetOpt import System.Directory import System.Environment import System.Exit import System.Time.Extra -- | Run a build system using command line arguments for configuration. -- The available flags are those from 'shakeOptDescrs', along with a few additional -- @make@ compatible flags that are not represented in 'ShakeOptions', such as @--print-directory@. -- If there are no file arguments then the 'Rules' are used directly, otherwise the file arguments -- are 'want'ed (after calling 'withoutActions'). As an example: -- -- @ -- main = 'shakeArgs' 'shakeOptions'{'shakeFiles' = \"_make\", 'shakeProgress' = 'progressSimple'} $ do -- 'phony' \"clean\" $ 'Development.Shake.removeFilesAfter' \"_make\" [\"\/\/*\"] -- 'want' [\"_make\/neil.txt\",\"_make\/emily.txt\"] -- \"_make\/*.txt\" '%>' \\out -> -- ... build action here ... -- @ -- -- This build system will default to building @neil.txt@ and @emily.txt@, while showing progress messages, -- and putting the Shake files in locations such as @_make\/.database@. Some example command line flags: -- -- * @main --no-progress@ will turn off progress messages. -- -- * @main -j6@ will build on 6 threads. -- -- * @main --help@ will display a list of supported flags. -- -- * @main clean@ will not build anything, but will remove the @_make@ directory, including the -- any 'shakeFiles'. 
-- -- * @main _make/henry.txt@ will not build @neil.txt@ or @emily.txt@, but will instead build @henry.txt@. shakeArgs :: ShakeOptions -> Rules () -> IO () shakeArgs opts rules = shakeArgsWith opts [] f where f _ files = return $ Just $ if null files then rules else want files >> withoutActions rules -- | A version of 'shakeArgs' with more flexible handling of command line arguments. -- The caller of 'shakeArgsWith' can add additional flags (the second argument) and choose how to convert -- the flags/arguments into rules (the third argument). Given: -- -- @ -- 'shakeArgsWith' opts flags (\\flagValues argValues -> result) -- @ -- -- * @opts@ is the initial 'ShakeOptions' value, which may have some fields overridden by command line flags. -- This argument is usually 'shakeOptions', perhaps with a few fields overridden. -- -- * @flags@ is a list of flag descriptions, which either produce a 'String' containing an error -- message (typically for flags with invalid arguments, e.g. @'Left' \"could not parse as int\"@), or a value -- that is passed as @flagValues@. If you have no custom flags, pass @[]@. -- -- * @flagValues@ is a list of custom flags that the user supplied. If @flags == []@ then this list will -- be @[]@. -- -- * @argValues@ is a list of non-flag arguments, which are often treated as files and passed to 'want'. -- -- * @result@ should produce a 'Nothing' to indicate that no building needs to take place, or a 'Just' -- providing the rules that should be used.
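Under the hood, 'shakeArgsWith' parses flags with @getOpt Permute@ and then uses 'partitionEithers' to separate parse errors ('Left' messages) from good flag values. A minimal base-only sketch of that pattern, independent of shake (the @Jobs@ flag here is a hypothetical stand-in, not Shake's real @-j@ definition):

```haskell
import Data.Either (partitionEithers)
import System.Console.GetOpt

-- Each flag parses to Either String Flag; Left carries an error message,
-- mirroring the flag descriptions shakeArgsWith accepts.
data Flag = Jobs Int deriving (Eq, Show)

flags :: [OptDescr (Either String Flag)]
flags = [Option "j" ["jobs"] (ReqArg parseJobs "N") "Number of jobs."]
  where parseJobs s = case reads s :: [(Int, String)] of
            [(n, "")] -> Right (Jobs n)
            _         -> Left ("could not parse as int: " ++ s)

-- getOpt Permute separates flags from file arguments; partitionEithers
-- then splits flag parse errors from successfully parsed values.
parseArgs :: [String] -> Either [String] ([Flag], [String])
parseArgs args =
    let (parsed, files, errs) = getOpt Permute flags args
        (bad, good) = partitionEithers parsed
    in if null (errs ++ bad) then Right (good, files) else Left (errs ++ bad)

main :: IO ()
main = do
    print (parseArgs ["-j4", "build"])
    print (parseArgs ["--jobs=x"])
```

The real implementation layers one more 'Either' on top (user flags versus Shake's own flags), but the error-or-value plumbing is the same.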
-- -- As an example of a build system that can use either @gcc@ or @distcc@ for compiling: -- -- @ --import System.Console.GetOpt -- --data Flags = DistCC deriving Eq --flags = [Option \"\" [\"distcc\"] (NoArg $ Right DistCC) \"Run distributed.\"] -- --main = 'shakeArgsWith' 'shakeOptions' flags $ \\flags targets -> return $ Just $ do -- if null targets then 'want' [\"result.exe\"] else 'want' targets -- let compiler = if DistCC \`elem\` flags then \"distcc\" else \"gcc\" -- \"*.o\" '%>' \\out -> do -- 'need' ... -- 'cmd' compiler ... -- ... -- @ -- -- Now you can pass @--distcc@ to use the @distcc@ compiler. shakeArgsWith :: ShakeOptions -> [OptDescr (Either String a)] -> ([a] -> [String] -> IO (Maybe (Rules ()))) -> IO () shakeArgsWith baseOpts userOptions rules = do addTiming "shakeArgsWith" args <- getArgs let (flags,files,errs) = getOpt Permute opts args (flagsError,flag1) = partitionEithers flags (self,user) = partitionEithers flag1 (flagsExtra,flagsShake) = first concat $ unzip self assumeNew = [x | AssumeNew x <- flagsExtra] assumeOld = [x | AssumeOld x <- flagsExtra] progressReplays = [x | ProgressReplay x <- flagsExtra] progressRecords = [x | ProgressRecord x <- flagsExtra] changeDirectory = listToMaybe [x | ChangeDirectory x <- flagsExtra] printDirectory = last $ False : [x | PrintDirectory x <- flagsExtra] shakeOpts = foldl' (flip ($)) baseOpts flagsShake -- error if you pass some clean and some dirty with specific flags errs <- return $ errs ++ flagsError ++ ["cannot mix " ++ a ++ " and " ++ b | a:b:_ <- [["`--assume-new'" | assumeNew/=[] ] ++ ["`--assume-old'" | assumeOld/=[] ] ++ ["explicit targets" | files/=[]]]] when (errs /= []) $ do putStr $ unlines $ map ("shake: " ++) $ filter (not . 
null) $ lines $ unlines errs showHelp exitFailure if Help `elem` flagsExtra then do showHelp else if Version `elem` flagsExtra then putStrLn $ "Shake build system, version " ++ showVersion version else if NumericVersion `elem` flagsExtra then putStrLn $ showVersion version else if Demo `elem` flagsExtra then demo $ shakeStaunch shakeOpts else if not $ null progressReplays then do dat <- forM progressReplays $ \file -> do src <- readFile file return (file, map read $ lines src) forM_ (if null $ shakeReport shakeOpts then ["-"] else shakeReport shakeOpts) $ \file -> do putStrLn $ "Writing report to " ++ file writeProgressReport file dat else do when (Sleep `elem` flagsExtra) $ threadDelay 1000000 start <- getCurrentTime curdir <- getCurrentDirectory let redir = case changeDirectory of Nothing -> id -- get the "html" directory so it caches with the current directory -- required only for debug code Just d -> bracket_ (getDataFileName "html" >> setCurrentDirectory d) (setCurrentDirectory curdir) shakeOpts <- if null progressRecords then return shakeOpts else do t <- offsetTime return shakeOpts{shakeProgress = \p -> do bracket (forkIO $ shakeProgress shakeOpts p) killThread $ const $ progressDisplay 1 (const $ return ()) $ do p <- p t <- t forM_ progressRecords $ \file -> do appendFile file $ show (t,p) ++ "\n" return p } (ran,res) <- redir $ do when printDirectory $ putStrLn $ "shake: In directory `" ++ curdir ++ "'" rules <- rules user files case rules of Nothing -> return (False,Right ()) Just rules -> do res <- try_ $ shake shakeOpts $ if NoBuild `elem` flagsExtra then withoutActions rules else rules return (True, res) if not ran || shakeVerbosity shakeOpts < Normal || NoTime `elem` flagsExtra then either throwIO return res else let esc code = if Color `elem` flagsExtra then escape code else id in case res of Left err -> if Exception `elem` flagsExtra then throw err else do putStrLn $ esc "31" $ show err exitFailure Right () -> do stop <- getCurrentTime let tot = 
diffUTCTime stop start (mins,secs) = divMod (ceiling tot) (60 :: Int) time = show mins ++ ":" ++ ['0' | secs < 10] ++ show secs putStrLn $ esc "32" $ "Build completed in " ++ time ++ "m" where opts = map (wrap Left . snd) shakeOptsEx ++ map (wrap Right) userOptions showHelp = do progName <- getProgName putStr $ unlines $ ("Usage: " ++ progName ++ " [options] [target] ...") : "Options:" : showOptDescr opts wrap :: (a -> b) -> OptDescr (Either String a) -> OptDescr (Either String b) wrap f = fmapOptDescr (either Left (Right . f)) showOptDescr :: [OptDescr a] -> [String] showOptDescr xs = concat [ if nargs <= 26 then [" " ++ args ++ replicate (28 - nargs) ' ' ++ desc] else [" " ++ args, replicate 30 ' ' ++ desc] | Option s l arg desc <- xs , let args = intercalate ", " $ map (short arg) s ++ map (long arg) l , let nargs = length args] where short NoArg{} x = "-" ++ [x] short (ReqArg _ b) x = "-" ++ [x] ++ " " ++ b short (OptArg _ b) x = "-" ++ [x] ++ "[=" ++ b ++ "]" long NoArg{} x = "--" ++ x long (ReqArg _ b) x = "--" ++ x ++ "=" ++ b long (OptArg _ b) x = "--" ++ x ++ "[=" ++ b ++ "]" fmapOptDescr :: (a -> b) -> OptDescr a -> OptDescr b fmapOptDescr f (Option a b c d) = Option a b (g c) d where g (NoArg a) = NoArg $ f a g (ReqArg a b) = ReqArg (f . a) b g (OptArg a b) = OptArg (f . a) b -- | A list of command line options that can be used to modify 'ShakeOptions'. Each option returns -- either an error message (invalid argument to the flag) or a function that changes some fields -- in 'ShakeOptions'. The command line flags are @make@ compatible where possible, but additional -- flags have been added for the extra options Shake supports. shakeOptDescrs :: [OptDescr (Either String (ShakeOptions -> ShakeOptions))] shakeOptDescrs = [fmapOptDescr (either Left (Right .
snd)) o | (True, o) <- shakeOptsEx] data Extra = ChangeDirectory FilePath | Version | NumericVersion | AssumeNew FilePath | AssumeOld FilePath | PrintDirectory Bool | Color | Help | Sleep | NoTime | Exception | NoBuild | ProgressRecord FilePath | ProgressReplay FilePath | Demo deriving Eq unescape :: String -> String unescape ('\ESC':'[':xs) = unescape $ drop 1 $ dropWhile (not . isAlpha) xs unescape (x:xs) = x : unescape xs unescape [] = [] escape :: String -> String -> String escape code x = "\ESC[" ++ code ++ "m" ++ x ++ "\ESC[0m" -- | True if it has a potential effect on ShakeOptions shakeOptsEx :: [(Bool, OptDescr (Either String ([Extra], ShakeOptions -> ShakeOptions)))] shakeOptsEx = [yes $ Option "a" ["abbrev"] (pairArg "abbrev" "FULL=SHORT" $ \a s -> s{shakeAbbreviations=shakeAbbreviations s ++ [a]}) "Use abbreviation in status messages." ,yes $ Option "B" ["always-make"] (noArg $ \s -> s{shakeAssume=Just AssumeDirty}) "Unconditionally make all targets." ,no $ Option "" ["no-build"] (NoArg $ Right ([NoBuild], id)) "Don't build anything." ,no $ Option "C" ["directory"] (ReqArg (\x -> Right ([ChangeDirectory x],id)) "DIRECTORY") "Change to DIRECTORY before doing anything." ,yes $ Option "" ["color","colour"] (NoArg $ Right ([Color], \s -> s{shakeOutput=outputColor (shakeOutput s)})) "Colorize the output." ,yes $ Option "d" ["debug"] (OptArg (\x -> Right ([], \s -> s{shakeVerbosity=Diagnostic, shakeOutput=outputDebug (shakeOutput s) x})) "FILE") "Print lots of debugging information." ,no $ Option "" ["demo"] (NoArg $ Right ([Demo], id)) "Run in demo mode." ,yes $ Option "" ["digest"] (NoArg $ Right ([], \s -> s{shakeChange=ChangeDigest})) "Files change when digest changes." ,yes $ Option "" ["digest-and"] (NoArg $ Right ([], \s -> s{shakeChange=ChangeModtimeAndDigest})) "Files change when modtime and digest change." 
,yes $ Option "" ["digest-and-input"] (NoArg $ Right ([], \s -> s{shakeChange=ChangeModtimeAndDigestInput})) "Files change on modtime (and digest for inputs)." ,yes $ Option "" ["digest-or"] (NoArg $ Right ([], \s -> s{shakeChange=ChangeModtimeOrDigest})) "Files change when modtime or digest change." ,no $ Option "" ["exception"] (NoArg $ Right ([Exception], id)) "Throw exceptions directly." ,yes $ Option "" ["flush"] (intArg 1 "flush" "N" (\i s -> s{shakeFlush=Just i})) "Flush metadata every N seconds." ,yes $ Option "" ["never-flush"] (noArg $ \s -> s{shakeFlush=Nothing}) "Never explicitly flush metadata." ,no $ Option "h" ["help"] (NoArg $ Right ([Help],id)) "Print this message and exit." ,yes $ Option "j" ["jobs"] (optIntArg 0 "jobs" "N" $ \i s -> s{shakeThreads=fromMaybe 0 i}) "Allow N jobs/threads at once [default CPUs]." ,yes $ Option "k" ["keep-going"] (noArg $ \s -> s{shakeStaunch=True}) "Keep going when some targets can't be made." ,yes $ Option "l" ["lint"] (noArg $ \s -> s{shakeLint=Just LintBasic}) "Perform limited validation after the run." ,yes $ Option "" ["lint-tracker"] (noArg $ \s -> s{shakeLint=Just LintTracker}) "Use tracker.exe to do validation." ,yes $ Option "" ["no-lint"] (noArg $ \s -> s{shakeLint=Nothing}) "Turn off --lint." ,yes $ Option "" ["live"] (OptArg (\x -> Right ([], \s -> s{shakeLiveFiles=shakeLiveFiles s ++ [fromMaybe "live.txt" x]})) "FILE") "List the files that are live [to live.txt]." ,yes $ Option "m" ["metadata"] (reqArg "PREFIX" $ \x s -> s{shakeFiles=x}) "Prefix for storing metadata files." ,no $ Option "" ["numeric-version"] (NoArg $ Right ([NumericVersion],id)) "Print just the version number and exit." ,no $ Option "o" ["old-file","assume-old"] (ReqArg (\x -> Right ([AssumeOld x],id)) "FILE") "Consider FILE to be very old and don't remake it." ,yes $ Option "" ["old-all"] (noArg $ \s -> s{shakeAssume=Just AssumeClean}) "Don't remake any files." 
,yes $ Option "" ["assume-skip"] (noArg $ \s -> s{shakeAssume=Just AssumeSkip}) "Don't remake any files this run." ,yes $ Option "" ["skip-commands"] (noArg $ \s -> s{shakeRunCommands=False}) "Try and avoid running external programs." ,yes $ Option "r" ["report","profile"] (OptArg (\x -> Right ([], \s -> s{shakeReport=shakeReport s ++ [fromMaybe "report.html" x]})) "FILE") "Write out profiling information [to report.html]." ,yes $ Option "" ["no-reports"] (noArg $ \s -> s{shakeReport=[]}) "Turn off --report." ,yes $ Option "" ["rule-version"] (reqArg "VERSION" $ \x s -> s{shakeVersion=x}) "Version of the build rules." ,yes $ Option "" ["no-rule-version"] (noArg $ \s -> s{shakeVersionIgnore=True}) "Ignore the build rules version." ,yes $ Option "s" ["silent"] (noArg $ \s -> s{shakeVerbosity=Silent}) "Don't print anything." ,no $ Option "" ["sleep"] (NoArg $ Right ([Sleep],id)) "Sleep for a second before building." ,yes $ Option "S" ["no-keep-going","stop"] (noArg $ \s -> s{shakeStaunch=False}) "Turns off -k." ,yes $ Option "" ["storage"] (noArg $ \s -> s{shakeStorageLog=True}) "Write a storage log." ,yes $ Option "p" ["progress"] (progress $ optIntArg 1 "progress" "N" $ \i s -> s{shakeProgress=prog $ fromMaybe 5 i}) "Show progress messages [every N secs, default 5]." ,yes $ Option "" ["no-progress"] (noArg $ \s -> s{shakeProgress=const $ return ()}) "Don't show progress messages." ,yes $ Option "q" ["quiet"] (noArg $ \s -> s{shakeVerbosity=move (shakeVerbosity s) pred}) "Don't print much." ,no $ Option "" ["no-time"] (NoArg $ Right ([NoTime],id)) "Don't print build time." ,yes $ Option "" ["timings"] (noArg $ \s -> s{shakeTimings=True}) "Print phase timings." ,yes $ Option "" ["touch"] (noArg $ \s -> s{shakeAssume=Just AssumeClean}) "Assume targets are clean." ,yes $ Option "V" ["verbose","trace"] (noArg $ \s -> s{shakeVerbosity=move (shakeVerbosity s) succ}) "Print tracing information." 
,no $ Option "v" ["version"] (NoArg $ Right ([Version],id)) "Print the version number and exit." ,no $ Option "w" ["print-directory"] (NoArg $ Right ([PrintDirectory True],id)) "Print the current directory." ,no $ Option "" ["no-print-directory"] (NoArg $ Right ([PrintDirectory False],id)) "Turn off -w, even if it was turned on implicitly." ,no $ Option "W" ["what-if","new-file","assume-new"] (ReqArg (\x -> Right ([AssumeNew x],id)) "FILE") "Consider FILE to be infinitely new." ] where yes = (,) True no = (,) False move :: Verbosity -> (Int -> Int) -> Verbosity move x by = toEnum $ min (fromEnum mx) $ max (fromEnum mn) $ by $ fromEnum x where (mn,mx) = (asTypeOf minBound x, asTypeOf maxBound x) noArg f = NoArg $ Right ([], f) reqArg a f = ReqArg (\x -> Right ([], f x)) a intArg mn flag a f = flip ReqArg a $ \x -> case reads x of [(i,"")] | i >= mn -> Right ([],f i) _ -> Left $ "the `--" ++ flag ++ "' option requires a number, " ++ show mn ++ " or above" optIntArg mn flag a f = flip OptArg a $ maybe (Right ([], f Nothing)) $ \x -> case reads x of [(i,"")] | i >= mn -> Right ([],f $ Just i) _ -> Left $ "the `--" ++ flag ++ "' option requires a number, " ++ show mn ++ " or above" pairArg flag a f = flip ReqArg a $ \x -> case break (== '=') x of (a,'=':b) -> Right ([],f (a,b)) _ -> Left $ "the `--" ++ flag ++ "' option requires an = in the argument" progress (OptArg func msg) = flip OptArg msg $ \x -> case break (== '=') `fmap` x of Just ("record",file) -> Right ([ProgressRecord $ if null file then "progress.txt" else tail file], id) Just ("replay",file) -> Right ([ProgressReplay $ if null file then "progress.txt" else tail file], id) _ -> func x outputDebug output Nothing = output outputDebug output (Just file) = \v msg -> do when (v /= Diagnostic) $ output v msg appendFile file $ unescape msg outputColor output v msg = output v $ escape "34" msg prog i p = do program <- progressProgram progressDisplay i (\s -> progressTitlebar s >> program s) p 
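The option list above relies on small helpers (`intArg`, `optIntArg`) to validate numeric flag arguments such as `--flush=N` and `--jobs=N` via `reads`: the argument must parse completely as an `Int` and be at least a stated minimum. A minimal standalone sketch of that validation logic, using only the Prelude; the name `parseIntArg` is hypothetical and not part of the Shake API:

```haskell
-- Sketch of the numeric-flag validation used by intArg/optIntArg:
-- accept the value only if it parses fully as an Int >= the minimum,
-- otherwise produce the same style of error message as the real helpers.
parseIntArg :: Int -> String -> String -> Either String Int
parseIntArg mn flag x = case reads x of
    [(i, "")] | i >= mn -> Right i
    _ -> Left $ "the `--" ++ flag ++ "' option requires a number, " ++ show mn ++ " or above"

main :: IO ()
main = do
    print $ parseIntArg 1 "flush" "10"   -- Right 10
    print $ parseIntArg 1 "flush" "0"    -- rejected: below the minimum
    print $ parseIntArg 0 "jobs" "4x"    -- rejected: trailing text after the number
```

Note how the `[(i, "")]` pattern rejects partial parses such as `"4x"`, which `read` alone would turn into a runtime error.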
shake-0.15.5/src/Development/Shake/Rules/0000755000000000000000000000000012560222036016257 0ustar0000000000000000shake-0.15.5/src/Development/Shake/Rules/Rerun.hs0000644000000000000000000000231512560222036017707 0ustar0000000000000000{-# LANGUAGE MultiParamTypeClasses, GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-}

module Development.Shake.Rules.Rerun(
    defaultRuleRerun, alwaysRerun
    ) where

import Development.Shake.Core
import Development.Shake.Classes

newtype AlwaysRerunQ = AlwaysRerunQ () deriving (Typeable,Eq,Hashable,Binary,NFData)
instance Show AlwaysRerunQ where show _ = "alwaysRerun"

newtype AlwaysRerunA = AlwaysRerunA () deriving (Typeable,Hashable,Binary,NFData)
instance Show AlwaysRerunA where show _ = ""
instance Eq AlwaysRerunA where a == b = False

instance Rule AlwaysRerunQ AlwaysRerunA where
    storedValue _ _ = return Nothing

-- | Always rerun the associated action. Useful for defining rules that query
--   the environment. For example:
--
-- @
-- \"ghcVersion.txt\" 'Development.Shake.%>' \\out -> do
--     'alwaysRerun'
--     'Development.Shake.Stdout' stdout <- 'Development.Shake.cmd' \"ghc --numeric-version\"
--     'Development.Shake.writeFileChanged' out stdout
-- @
alwaysRerun :: Action ()
alwaysRerun = do AlwaysRerunA _ <- apply1 $ AlwaysRerunQ (); return ()

defaultRuleRerun :: Rules ()
defaultRuleRerun = rule $ \AlwaysRerunQ{} -> Just $ return $ AlwaysRerunA()
shake-0.15.5/src/Development/Shake/Rules/OrderOnly.hs0000644000000000000000000000236012560222036020531 0ustar0000000000000000{-# LANGUAGE MultiParamTypeClasses, GeneralizedNewtypeDeriving, DeriveDataTypeable #-}

module Development.Shake.Rules.OrderOnly(
    orderOnly, orderOnlyBS
    ) where

import Development.Shake.Core
import Development.Shake.Rules.File
import qualified Data.ByteString.Char8 as BS

-- | Define order-only dependencies: dependencies that will always
--   be built before continuing, but which aren't dependencies of this action.
--   Mostly useful for defining generated dependencies you think might be real dependencies.
--   If they turn out to be real dependencies, you should add an explicit dependency afterwards.
--
-- @
-- \"source.o\" %> \\out -> do
--     'orderOnly' [\"header.h\"]
--     () <- 'cmd' \"gcc -c source.c -o source.o -MMD -MF source.m\"
--     'neededMakefileDependencies' \"source.m\"
-- @
--
--   If @header.h@ is included by @source.c@ then the call to 'neededMakefileDependencies' will cause
--   it to be added as a real dependency. If it isn't, then the rule won't rebuild if it changes,
--   and you will have lost some opportunity for parallelism.
orderOnly :: [FilePath] -> Action ()
orderOnly = unsafeIgnoreDependencies . need

orderOnlyBS :: [BS.ByteString] -> Action ()
orderOnlyBS = unsafeIgnoreDependencies . needBS
shake-0.15.5/src/Development/Shake/Rules/Oracle.hs0000644000000000000000000001123412560222036020021 0ustar0000000000000000{-# LANGUAGE MultiParamTypeClasses, GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-}
-- Allows the user to violate the functional dependency, but it has a runtime check so still safe
{-# LANGUAGE UndecidableInstances #-}
{-# LANGUAGE CPP #-}

#if __GLASGOW_HASKELL__ >= 704
{-# LANGUAGE ConstraintKinds #-}
#endif

module Development.Shake.Rules.Oracle(
    addOracle, askOracle, askOracleWith
    ) where

import Development.Shake.Core
import Development.Shake.Classes

-- Use short type names, since the names appear in the Haddock and are too long if written in full
newtype OracleQ question = OracleQ question
    deriving (Show,Typeable,Eq,Hashable,Binary,NFData)
newtype OracleA answer = OracleA answer
    deriving (Show,Typeable,Eq,Hashable,Binary,NFData)

instance (
#if __GLASGOW_HASKELL__ >= 704
    ShakeValue q, ShakeValue a
#else
    Show q, Typeable q, Eq q, Hashable q, Binary q, NFData q,
    Show a, Typeable a, Eq a, Hashable a, Binary a, NFData a
#endif
    ) => Rule (OracleQ q) (OracleA a) where
    storedValue _ _ = return Nothing

-- | Add extra information which
--   rules can depend on.
--   An oracle is a function from a question type @q@ to an answer type @a@.
--   As an example, we can define an oracle allowing you to depend on the current version of GHC:
--
-- @
-- newtype GhcVersion = GhcVersion () deriving (Show,Typeable,Eq,Hashable,Binary,NFData)
-- rules = do
--     'addOracle' $ \\(GhcVersion _) -> fmap 'Development.Shake.fromStdout' $ 'Development.Shake.cmd' \"ghc --numeric-version\" :: Action String
--     ... rules ...
-- @
--
--   If a rule calls @'askOracle' (GhcVersion ())@, that rule will be rerun whenever the GHC version changes.
--   Some notes:
--
-- * We define @GhcVersion@ with a @newtype@ around @()@, allowing the use of @GeneralizedNewtypeDeriving@.
--   All the necessary type classes are exported from "Development.Shake.Classes".
--
-- * Each call to 'addOracle' must use a different type of question.
--
-- * Actions passed to 'addOracle' will be run in every build in which they are required,
--   but if their value does not change they will not invalidate any rules depending on them.
--   To get a similar behaviour using data stored in files, see 'Development.Shake.alwaysRerun'.
--
-- * If the value returned by 'askOracle' is ignored then 'askOracleWith' may help avoid ambiguous type messages.
--   Alternatively, use the result of 'addOracle', which is 'askOracle' restricted to the correct type.
-- -- As a more complex example, consider tracking Haskell package versions: -- -- @ --newtype GhcPkgList = GhcPkgList () deriving (Show,Typeable,Eq,Hashable,Binary,NFData) --newtype GhcPkgVersion = GhcPkgVersion String deriving (Show,Typeable,Eq,Hashable,Binary,NFData) -- --rules = do -- getPkgList \<- 'addOracle' $ \\GhcPkgList{} -> do -- Stdout out <- 'Development.Shake.cmd' \"ghc-pkg list --simple-output\" -- return [(reverse b, reverse a) | x <- words out, let (a,_:b) = break (== \'-\') $ reverse x] -- -- -- getPkgVersion \<- 'addOracle' $ \\(GhcPkgVersion pkg) -> do -- pkgs <- getPkgList $ GhcPkgList () -- return $ lookup pkg pkgs -- -- -- \"myrule\" %> \\_ -> do -- getPkgVersion $ GhcPkgVersion \"shake\" -- ... rule using the shake version ... -- @ -- -- Using these definitions, any rule depending on the version of @shake@ -- should call @getPkgVersion $ GhcPkgVersion \"shake\"@ to rebuild when @shake@ is upgraded. addOracle :: ( #if __GLASGOW_HASKELL__ >= 704 ShakeValue q, ShakeValue a #else Show q, Typeable q, Eq q, Hashable q, Binary q, NFData q, Show a, Typeable a, Eq a, Hashable a, Binary a, NFData a #endif ) => (q -> Action a) -> Rules (q -> Action a) addOracle act = do rule $ \(OracleQ q) -> Just $ fmap OracleA $ act q return askOracle -- | Get information previously added with 'addOracle'. The question/answer types must match those provided -- to 'addOracle'. askOracle :: ( #if __GLASGOW_HASKELL__ >= 704 ShakeValue q, ShakeValue a #else Show q, Typeable q, Eq q, Hashable q, Binary q, NFData q, Show a, Typeable a, Eq a, Hashable a, Binary a, NFData a #endif ) => q -> Action a askOracle question = do OracleA answer <- apply1 $ OracleQ question; return answer -- | Get information previously added with 'addOracle'. The second argument is not used, but can -- be useful to fix the answer type, avoiding ambiguous type error messages. 
askOracleWith :: ( #if __GLASGOW_HASKELL__ >= 704 ShakeValue q, ShakeValue a #else Show q, Typeable q, Eq q, Hashable q, Binary q, NFData q, Show a, Typeable a, Eq a, Hashable a, Binary a, NFData a #endif ) => q -> a -> Action a askOracleWith question _ = askOracle question shake-0.15.5/src/Development/Shake/Rules/Files.hs0000644000000000000000000001505512560222036017663 0ustar0000000000000000{-# LANGUAGE MultiParamTypeClasses, GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} module Development.Shake.Rules.Files( (&?>), (&%>) ) where import Control.Monad import Control.Monad.IO.Class import Data.Maybe import Data.List.Extra import System.Directory import Development.Shake.Core hiding (trackAllow) import General.Extra import General.String import Development.Shake.Classes import Development.Shake.Rules.File import Development.Shake.FilePattern import Development.Shake.FilePath import Development.Shake.Types import Development.Shake.ByteString infix 1 &?>, &%> newtype FilesQ = FilesQ [FileQ] deriving (Typeable,Eq,Hashable,Binary,NFData) newtype FilesA = FilesA [FileA] deriving (Typeable,Eq,Hashable,Binary,NFData) instance Show FilesA where show (FilesA xs) = unwords $ "Files" : map (drop 5 . show) xs instance Show FilesQ where show (FilesQ xs) = unwords $ map (showQuote . show) xs instance Rule FilesQ FilesA where storedValue opts (FilesQ xs) = fmap (fmap FilesA . sequence) $ mapM (storedValue opts) xs equalValue opts (FilesQ qs) (FilesA xs) (FilesA ys) | let n = length qs in n /= length xs || n /= length ys = NotEqual | otherwise = foldr and_ EqualCheap (zipWith3 (equalValue opts) qs xs ys) where and_ NotEqual x = NotEqual and_ EqualCheap x = x and_ EqualExpensive x = if x == NotEqual then NotEqual else EqualExpensive -- | Define a rule for building multiple files at the same time. -- Think of it as the AND (@&&@) equivalent of '%>'. 
--   As an example, a single invocation of GHC produces both @.hi@ and @.o@ files:
--
-- @
-- [\"*.o\",\"*.hi\"] '&%>' \\[o,hi] -> do
--     let hs = o 'Development.Shake.FilePath.-<.>' \"hs\"
--     'Development.Shake.need' ... -- all files the .hs import
--     'Development.Shake.cmd' \"ghc -c\" [hs]
-- @
--
--   However, in practice, it's usually easier to define rules with '%>' and make the @.hi@ depend
--   on the @.o@. When defining rules that build multiple files, all the 'FilePattern' values must
--   have the same sequence of @\/\/@ and @*@ wildcards in the same order.
--   This function will create directories for the result files, if necessary.
(&%>) :: [FilePattern] -> ([FilePath] -> Action ()) -> Rules ()
ps &%> act
    | not $ compatible ps = error $ unlines $
        ["All patterns to &%> must have the same number and position of // and * wildcards"] ++
        ["* " ++ p ++ (if compatible [p, head ps] then "" else " (incompatible)") | p <- ps]
    | otherwise = do
        forM_ ps $ \p -> p %> \file -> do
            _ :: FilesA <- apply1 $ FilesQ $ map (FileQ . packU_ . filepathNormalise . unpackU_ . packU . substitute (extract p file)) ps
            return ()
        (if all simple ps then id else priority 0.5) $ rule $ \(FilesQ xs_) ->
            let xs = map (unpackU . fromFileQ) xs_ in
            if not $ length xs == length ps && and (zipWith (?==) ps xs) then Nothing else Just $ do
                liftIO $ mapM_ (createDirectoryIfMissing True) $ nubOrd $ map takeDirectory xs
                trackAllow xs
                act xs
                getFileTimes "&%>" xs_

-- | Define a rule for building multiple files at the same time, a more powerful
--   and more dangerous version of '&%>'. Think of it as the AND (@&&@) equivalent of '?>'.
--
--   Given an application @test &?> ...@, @test@ should return @Just@ if the rule applies, and should
--   return the list of files that will be produced. This list /must/ include the file passed as an argument and should
--   obey the invariant:
--
-- > forAll $ \x ys -> test x == Just ys ==> x `elem` ys && all ((== Just ys) .
--   test) ys
--
--   As an example of a function satisfying the invariant:
--
-- @
--test x | 'Development.Shake.FilePath.takeExtension' x \`elem\` [\".hi\",\".o\"]
--       = Just ['Development.Shake.FilePath.dropExtension' x 'Development.Shake.FilePath.<.>' \"hi\", 'Development.Shake.FilePath.dropExtension' x 'Development.Shake.FilePath.<.>' \"o\"]
--test _ = Nothing
-- @
--
--   Regardless of whether @Foo.hi@ or @Foo.o@ is passed, the function always returns @[Foo.hi, Foo.o]@.
(&?>) :: (FilePath -> Maybe [FilePath]) -> ([FilePath] -> Action ()) -> Rules ()
(&?>) test act = priority 0.5 $ do
    let norm = toStandard . normaliseEx
    let inputOutput suf inp out =
            ["Input" ++ suf ++ ":", " " ++ inp] ++
            ["Output" ++ suf ++ ":"] ++ map (" "++) out
    let normTest = fmap (map norm) . test
    let checkedTest x = case normTest x of
            Nothing -> Nothing
            Just ys | x `notElem` ys -> error $ unlines $
                ["Invariant broken in &?>, did not return the input (after normalisation)."] ++
                inputOutput "" x ys
            Just ys | bad:_ <- filter ((/= Just ys) . normTest) ys -> error $ unlines $
                ["Invariant broken in &?>, not equal for all arguments (after normalisation)."] ++
                inputOutput "1" x ys ++
                inputOutput "2" bad (fromMaybe ["Nothing"] $ normTest bad)
            Just ys -> Just ys

    isJust . checkedTest ?> \x -> do
        -- FIXME: Could optimise this test by calling rule directly and returning FileA Eq Eq Eq
        --        But only saves noticeable time on uncommon Change modes
        _ :: FilesA <- apply1 $ FilesQ $ map (FileQ . packU_ . filepathNormalise . unpackU_ . packU) $ fromJust $ test x
        return ()

    rule $ \(FilesQ xs_) -> let xs@(x:_) = map (unpackU .
fromFileQ) xs_ in case checkedTest x of Just ys | ys == xs -> Just $ do liftIO $ mapM_ (createDirectoryIfMissing True) $ nubOrd $ map takeDirectory xs act xs getFileTimes "&?>" xs_ Just ys -> error $ "Error, &?> is incompatible with " ++ show xs ++ " vs " ++ show ys Nothing -> Nothing getFileTimes :: String -> [FileQ] -> Action FilesA getFileTimes name xs = do opts <- getShakeOptions ys <- liftIO $ mapM (storedValue opts) xs case sequence ys of Just ys -> return $ FilesA ys Nothing | not $ shakeCreationCheck opts -> return $ FilesA [] Nothing -> do let missing = length $ filter isNothing ys error $ "Error, " ++ name ++ " rule failed to build " ++ show missing ++ " file" ++ (if missing == 1 then "" else "s") ++ " (out of " ++ show (length xs) ++ ")" ++ concat ["\n " ++ unpackU x ++ if isNothing y then " - MISSING" else "" | (FileQ x,y) <- zip xs ys] shake-0.15.5/src/Development/Shake/Rules/File.hs0000644000000000000000000002656712560222036017512 0ustar0000000000000000{-# LANGUAGE MultiParamTypeClasses, GeneralizedNewtypeDeriving, DeriveDataTypeable, ScopedTypeVariables #-} {-# LANGUAGE ViewPatterns #-} module Development.Shake.Rules.File( need, needBS, needed, neededBS, want, trackRead, trackWrite, trackAllow, defaultRuleFile, (%>), (|%>), (?>), phony, (~>), phonys, -- * Internal only FileQ(..), FileA ) where import Control.Applicative import Control.Monad.Extra import Control.Monad.IO.Class import System.Directory import qualified Data.ByteString.Char8 as BS import qualified Data.HashSet as Set import Development.Shake.Core hiding (trackAllow) import qualified Development.Shake.Core as S import General.String import Development.Shake.ByteString import Development.Shake.Classes import Development.Shake.FilePath(toStandard) import Development.Shake.FilePattern import Development.Shake.FileInfo import Development.Shake.Types import Development.Shake.Errors import Data.Bits import Data.List import Data.Maybe import System.FilePath(takeDirectory) -- important that this 
is the system local filepath, or wrong slashes go wrong import System.IO.Unsafe(unsafeInterleaveIO) infix 1 %>, ?>, |%>, ~> newtype FileQ = FileQ {fromFileQ :: BSU} deriving (Typeable,Eq,Hashable,Binary,NFData) instance Show FileQ where show (FileQ x) = unpackU x data FileA = FileA {-# UNPACK #-} !ModTime {-# UNPACK #-} !FileSize FileHash deriving (Typeable,Eq) instance Hashable FileA where hashWithSalt salt (FileA a b c) = hashWithSalt salt a `xor` hashWithSalt salt b `xor` hashWithSalt salt c instance NFData FileA where rnf (FileA a b c) = rnf a `seq` rnf b `seq` rnf c instance Binary FileA where put (FileA a b c) = put a >> put b >> put c get = liftA3 FileA get get get instance Show FileA where show (FileA m s h) = "File {mod=" ++ show m ++ ",size=" ++ show s ++ ",digest=" ++ show h ++ "}" instance Rule FileQ FileA where storedValue ShakeOptions{shakeChange=c} (FileQ x) = do res <- getFileInfo x case res of Nothing -> return Nothing Just (time,size) | c == ChangeModtime -> return $ Just $ FileA time size fileInfoNeq Just (time,size) -> do hash <- unsafeInterleaveIO $ getFileHash x return $ Just $ FileA (if c == ChangeDigest then fileInfoNeq else time) size hash equalValue ShakeOptions{shakeChange=c} q (FileA x1 x2 x3) (FileA y1 y2 y3) = case c of ChangeModtime -> bool $ x1 == y1 ChangeDigest -> bool $ x2 == y2 && x3 == y3 ChangeModtimeOrDigest -> bool $ x1 == y1 && x2 == y2 && x3 == y3 _ -> if x1 == y1 then EqualCheap else if x2 == y2 && x3 == y3 then EqualExpensive else NotEqual where bool b = if b then EqualCheap else NotEqual storedValueError :: ShakeOptions -> Bool -> String -> FileQ -> IO FileA {- storedValueError opts False msg x | False && not (shakeOutputCheck opts) = do when (shakeCreationCheck opts) $ do whenM (isNothing <$> (storedValue opts x :: IO (Maybe FileA))) $ error $ msg ++ "\n " ++ unpackU (fromFileQ x) return $ FileA fileInfoEq fileInfoEq fileInfoEq -} storedValueError opts input msg x = fromMaybe def <$> storedValue opts2 x where def = if 
shakeCreationCheck opts || input then error err else FileA fileInfoNeq fileInfoNeq fileInfoNeq err = msg ++ "\n " ++ unpackU (fromFileQ x) opts2 = if not input && shakeChange opts == ChangeModtimeAndDigestInput then opts{shakeChange=ChangeModtime} else opts -- | This function is not actually exported, but Haddock is buggy. Please ignore. defaultRuleFile :: Rules () defaultRuleFile = priority 0 $ rule $ \x -> Just $ do opts <- getShakeOptions liftIO $ storedValueError opts True "Error, file does not exist and no rule available:" x -- | Add a dependency on the file arguments, ensuring they are built before continuing. -- The file arguments may be built in parallel, in any order. This function is particularly -- necessary when calling 'Development.Shake.cmd' or 'Development.Shake.command'. As an example: -- -- @ -- \"\/\/*.rot13\" '%>' \\out -> do -- let src = 'Development.Shake.FilePath.dropExtension' out -- 'need' [src] -- 'Development.Shake.cmd' \"rot13\" [src] \"-o\" [out] -- @ -- -- Usually @need [foo,bar]@ is preferable to @need [foo] >> need [bar]@ as the former allows greater -- parallelism, while the latter requires @foo@ to finish building before starting to build @bar@. need :: [FilePath] -> Action () need xs = (apply $ map (FileQ . packU_ . filepathNormalise . unpackU_ . packU) xs :: Action [FileA]) >> return () needBS :: [BS.ByteString] -> Action () needBS xs = (apply $ map (FileQ . packU_ . filepathNormalise) xs :: Action [FileA]) >> return () -- | Like 'need', but if 'shakeLint' is set, check that the file does not rebuild. -- Used for adding dependencies on files that have already been used in this rule. 
needed :: [FilePath] -> Action ()
needed xs = do
    opts <- getShakeOptions
    if isNothing $ shakeLint opts then need xs else neededCheck $ map packU xs

neededBS :: [BS.ByteString] -> Action ()
neededBS xs = do
    opts <- getShakeOptions
    if isNothing $ shakeLint opts then needBS xs else neededCheck $ map packU_ xs

neededCheck :: [BSU] -> Action ()
neededCheck (map (packU_ . filepathNormalise . unpackU_) -> xs) = do
    opts <- getShakeOptions
    pre <- liftIO $ mapM (storedValue opts . FileQ) xs
    post <- apply $ map FileQ xs :: Action [FileA]
    let bad = [ (x, if isJust a then "File change" else "File created")
              | (x, a, b) <- zip3 xs pre post, maybe NotEqual (\a -> equalValue opts (FileQ x) a b) a == NotEqual]
    case bad of
        [] -> return ()
        (file,msg):_ -> liftIO $ errorStructured
            "Lint checking error - 'needed' file required rebuilding"
            [("File", Just $ unpackU file)
            ,("Error",Just msg)]
            ""

-- | Track that a file was read by the action preceding it. If 'shakeLint' is activated
--   then these files must be dependencies of this rule. Calls to 'trackRead' are
--   automatically inserted in 'LintTracker' mode.
trackRead :: [FilePath] -> Action ()
trackRead = mapM_ (trackUse . FileQ . packU)

-- | Track that a file was written by the action preceding it. If 'shakeLint' is activated
--   then these files must either be the target of this rule, or never referred to by the build system.
--   Calls to 'trackWrite' are automatically inserted in 'LintTracker' mode.
trackWrite :: [FilePath] -> Action ()
trackWrite = mapM_ (trackChange . FileQ . packU)

-- | Allow accessing a file in this rule, ignoring any 'trackRead'\/'trackWrite' calls matching
--   the pattern.
trackAllow :: [FilePattern] -> Action ()
trackAllow ps = do
    opts <- getShakeOptions
    when (isJust $ shakeLint opts) $ S.trackAllow $ \(FileQ x) -> any (?== unpackU x) ps

-- | Require that the argument files are built by the rules, used to specify the target.
-- -- @ -- main = 'Development.Shake.shake' 'shakeOptions' $ do -- 'want' [\"Main.exe\"] -- ... -- @ -- -- This program will build @Main.exe@, given sufficient rules. All arguments to all 'want' calls -- may be built in parallel, in any order. -- -- This function is defined in terms of 'action' and 'need', use 'action' if you need more complex -- targets than 'want' allows. want :: [FilePath] -> Rules () want [] = return () want xs = action $ need xs root :: String -> (FilePath -> Bool) -> (FilePath -> Action ()) -> Rules () root help test act = rule $ \(FileQ x_) -> let x = unpackU x_ in if not $ test x then Nothing else Just $ do liftIO $ createDirectoryIfMissing True $ takeDirectory x act x opts <- getShakeOptions liftIO $ storedValueError opts False ("Error, rule " ++ help ++ " failed to build file:") $ FileQ x_ -- | Declare a phony action -- an action that does not produce a file, and will be rerun -- in every execution that requires it. You can demand 'phony' rules using 'want' \/ 'need'. -- Phony actions are never executed more than once in a single build run. -- -- Phony actions are intended to define command-line abbreviations. If you 'need' a phony action -- in a rule then every execution where that rule is required will rerun both the rule and the phony -- action. phony :: String -> Action () -> Rules () phony name act = phonys $ \s -> if s == name then Just act else Nothing -- | A predicate version of 'phony', return 'Just' with the 'Action' for the matching rules. phonys :: (String -> Maybe (Action ())) -> Rules () phonys act = rule $ \(FileQ x_) -> case act $ unpackU x_ of Nothing -> Nothing Just act -> Just $ do act return $ FileA fileInfoNeq fileInfoNeq fileInfoNeq -- | Infix operator alias for 'phony', for sake of consistency with normal -- rules. (~>) :: String -> Action () -> Rules () (~>) = phony -- | Define a rule to build files. If the first argument returns 'True' for a given file, -- the second argument will be used to build it. 
Usually '%>' is sufficient, but '?>' gives -- additional power. For any file used by the build system, only one rule should return 'True'. -- This function will create the directory for the result file, if necessary. -- -- @ -- (all isUpper . 'Development.Shake.FilePath.takeBaseName') '?>' \\out -> do -- let src = 'Development.Shake.FilePath.replaceBaseName' out $ map toLower $ takeBaseName out -- 'Development.Shake.writeFile'' out . map toUpper =<< 'Development.Shake.readFile'' src -- @ -- -- If the 'Action' completes successfully the file is considered up-to-date, even if the file -- has not changed. (?>) :: (FilePath -> Bool) -> (FilePath -> Action ()) -> Rules () (?>) test act = priority 0.5 $ root "with ?>" test act -- | Define a set of patterns, and if any of them match, run the associated rule. Defined in terms of '%>'. -- Think of it as the OR (@||@) equivalent of '%>'. (|%>) :: [FilePattern] -> (FilePath -> Action ()) -> Rules () (|%>) pats act = do let (simp,other) = partition simple pats case simp of [] -> return () [p] -> root "with |%>" (\x -> toStandard x == p) act ps -> let ps = Set.fromList pats in root "with |%>" (flip Set.member ps . toStandard) act unless (null other) $ let ps = map (?==) other in priority 0.5 $ root "with |%>" (\x -> any ($ x) ps) act -- | Define a rule that matches a 'FilePattern', see '?==' for the pattern rules. -- Patterns with no wildcards have higher priority than those with wildcards, and no file -- required by the system may be matched by more than one pattern at the same priority -- (see 'priority' and 'alternatives' to modify this behaviour). -- This function will create the directory for the result file, if necessary. 
-- -- @ -- \"*.asm.o\" '%>' \\out -> do -- let src = 'Development.Shake.FilePath.dropExtension' out -- 'need' [src] -- 'Development.Shake.cmd' \"as\" [src] \"-o\" [out] -- @ -- -- To define a build system for multiple compiled languages, we recommend using @.asm.o@, -- @.cpp.o@, @.hs.o@, to indicate which language produces an object file. -- I.e., the file @foo.cpp@ produces object file @foo.cpp.o@. -- -- Note that matching is case-sensitive, even on Windows. -- -- If the 'Action' completes successfully the file is considered up-to-date, even if the file -- has not changed. (%>) :: FilePattern -> (FilePath -> Action ()) -> Rules () (%>) test act = (if simple test then id else priority 0.5) $ root (show test) (test ?==) act shake-0.15.5/src/Development/Shake/Rules/Directory.hs0000644000000000000000000002545112560222036020566 0ustar0000000000000000{-# LANGUAGE MultiParamTypeClasses, GeneralizedNewtypeDeriving, ScopedTypeVariables, DeriveDataTypeable, RecordWildCards, FlexibleContexts #-} -- | Both System.Directory and System.Environment wrappers module Development.Shake.Rules.Directory( doesFileExist, doesDirectoryExist, getDirectoryContents, getDirectoryFiles, getDirectoryDirs, getEnv, getEnvWithDefault, removeFiles, removeFilesAfter, defaultRuleDirectory ) where import Control.Applicative import Control.Monad.Extra import Control.Monad.IO.Class import Data.Maybe import Data.Binary import Data.List import qualified System.Directory as IO import qualified System.Environment.Extra as IO import Development.Shake.Core import Development.Shake.Classes import Development.Shake.FilePath import Development.Shake.FilePattern import General.Extra import Prelude newtype DoesFileExistQ = DoesFileExistQ FilePath deriving (Typeable,Eq,Hashable,Binary,NFData) instance Show DoesFileExistQ where show (DoesFileExistQ a) = "doesFileExist " ++ showQuote a newtype DoesFileExistA = DoesFileExistA Bool deriving (Typeable,Eq,Hashable,Binary,NFData) instance Show DoesFileExistA where show 
(DoesFileExistA a) = show a newtype DoesDirectoryExistQ = DoesDirectoryExistQ FilePath deriving (Typeable,Eq,Hashable,Binary,NFData) instance Show DoesDirectoryExistQ where show (DoesDirectoryExistQ a) = "doesDirectoryExist " ++ showQuote a newtype DoesDirectoryExistA = DoesDirectoryExistA Bool deriving (Typeable,Eq,Hashable,Binary,NFData) instance Show DoesDirectoryExistA where show (DoesDirectoryExistA a) = show a newtype GetEnvQ = GetEnvQ String deriving (Typeable,Eq,Hashable,Binary,NFData) instance Show GetEnvQ where show (GetEnvQ a) = "getEnv " ++ showQuote a newtype GetEnvA = GetEnvA (Maybe String) deriving (Typeable,Eq,Hashable,Binary,NFData) instance Show GetEnvA where show (GetEnvA a) = maybe "" showQuote a data GetDirectoryQ = GetDir {dir :: FilePath} | GetDirFiles {dir :: FilePath, pat :: [FilePattern]} | GetDirDirs {dir :: FilePath} deriving (Typeable,Eq) newtype GetDirectoryA = GetDirectoryA [FilePath] deriving (Typeable,Eq,Hashable,Binary,NFData) instance Show GetDirectoryQ where show (GetDir x) = "getDirectoryContents " ++ showQuote x show (GetDirFiles a b) = "getDirectoryFiles " ++ showQuote a ++ " [" ++ unwords (map showQuote b) ++ "]" show (GetDirDirs x) = "getDirectoryDirs " ++ showQuote x instance Show GetDirectoryA where show (GetDirectoryA xs) = unwords $ map showQuote xs instance NFData GetDirectoryQ where rnf (GetDir a) = rnf a rnf (GetDirFiles a b) = rnf a `seq` rnf b rnf (GetDirDirs a) = rnf a instance Hashable GetDirectoryQ where hashWithSalt salt = hashWithSalt salt . 
f where f (GetDir x) = (0 :: Int, x, []) f (GetDirFiles x y) = (1, x, y) f (GetDirDirs x) = (2, x, []) instance Binary GetDirectoryQ where get = do i <- getWord8 case i of 0 -> liftM GetDir get 1 -> liftM2 GetDirFiles get get 2 -> liftM GetDirDirs get put (GetDir x) = putWord8 0 >> put x put (GetDirFiles x y) = putWord8 1 >> put x >> put y put (GetDirDirs x) = putWord8 2 >> put x instance Rule DoesFileExistQ DoesFileExistA where storedValue _ (DoesFileExistQ x) = fmap (Just . DoesFileExistA) $ IO.doesFileExist x instance Rule DoesDirectoryExistQ DoesDirectoryExistA where storedValue _ (DoesDirectoryExistQ x) = fmap (Just . DoesDirectoryExistA) $ IO.doesDirectoryExist x instance Rule GetEnvQ GetEnvA where storedValue _ (GetEnvQ x) = fmap (Just . GetEnvA) $ IO.lookupEnv x instance Rule GetDirectoryQ GetDirectoryA where storedValue _ x = fmap Just $ getDir x -- | This function is not actually exported, but Haddock is buggy. Please ignore. defaultRuleDirectory :: Rules () defaultRuleDirectory = do rule $ \(DoesFileExistQ x) -> Just $ liftIO $ fmap DoesFileExistA $ IO.doesFileExist x rule $ \(DoesDirectoryExistQ x) -> Just $ liftIO $ fmap DoesDirectoryExistA $ IO.doesDirectoryExist x rule $ \(x :: GetDirectoryQ) -> Just $ liftIO $ getDir x rule $ \(GetEnvQ x) -> Just $ liftIO $ fmap GetEnvA $ IO.lookupEnv x -- | Returns 'True' if the file exists. The existence of the file is tracked as a -- dependency, and if the file is created or deleted the rule will rerun in subsequent builds. -- -- You should not call 'doesFileExist' on files which can be created by the build system. doesFileExist :: FilePath -> Action Bool doesFileExist file = do DoesFileExistA res <- apply1 $ DoesFileExistQ file return res -- | Returns 'True' if the directory exists. The existence of the directory is tracked as a -- dependency, and if the directory is created or deleted the rule will rerun in subsequent builds.
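The `Binary` instance for `GetDirectoryQ` above uses a standard scheme: one tag byte selects the constructor, then the payload follows. A dependency-free sketch of the same roundtrip idea — here the "payload" is serialised via `show`/`read` so the example needs only base, whereas the real instance uses the binary package's `getWord8`/`put`:

```haskell
-- Hypothetical mirror of GetDirectoryQ, for illustration only.
data Q = QDir FilePath | QFiles FilePath [String] | QDirs FilePath
  deriving (Show, Eq)

-- Encode as (constructor tag, payload), like putWord8 n >> put x.
encodeQ :: Q -> (Int, String)
encodeQ (QDir x)     = (0, show (x, [] :: [String]))
encodeQ (QFiles x y) = (1, show (x, y))
encodeQ (QDirs x)    = (2, show (x, [] :: [String]))

-- Decode by dispatching on the tag, like the case over getWord8 above.
decodeQ :: (Int, String) -> Q
decodeQ (tag, s) = case (tag, read s :: (FilePath, [String])) of
  (0, (x, _)) -> QDir x
  (1, (x, y)) -> QFiles x y
  (2, (x, _)) -> QDirs x
  _           -> error "unknown constructor tag"

main :: IO ()
main = print (decodeQ (encodeQ (QFiles "src" ["*.hs"])) == QFiles "src" ["*.hs"])
-- prints True
```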
-- -- You should not call 'doesDirectoryExist' on directories which can be created by the build system. doesDirectoryExist :: FilePath -> Action Bool doesDirectoryExist file = do DoesDirectoryExistA res <- apply1 $ DoesDirectoryExistQ file return res -- | Return 'Just' the value of the environment variable, or 'Nothing' -- if the variable is not set. The environment variable is tracked as a -- dependency, and if it changes the rule will rerun in subsequent builds. getEnv :: String -> Action (Maybe String) getEnv var = do GetEnvA res <- apply1 $ GetEnvQ var return res -- | Return the value of the environment variable, or the default value if it is -- not set. Similar to 'getEnv'. getEnvWithDefault :: String -> String -> Action String getEnvWithDefault def var = fromMaybe def <$> getEnv var -- | Get the contents of a directory. The result will be sorted, and will not contain -- the entries @.@ or @..@ (unlike the standard Haskell version). The resulting paths will be relative -- to the first argument. The result is tracked as a -- dependency, and if it changes the rule will rerun in subsequent builds. -- -- It is usually simpler to call either 'getDirectoryFiles' or 'getDirectoryDirs'. getDirectoryContents :: FilePath -> Action [FilePath] getDirectoryContents x = getDirAction $ GetDir x -- | Get the files anywhere under a directory that match any of a set of patterns. -- For the interpretation of the patterns see '?=='. All results will be -- relative to the 'FilePath' argument. The result is tracked as a -- dependency, and if it changes the rule will rerun in subsequent builds.
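The `getEnvWithDefault` definition above is just `fromMaybe def <$> getEnv var`. The same pattern in plain IO, using base's `lookupEnv` (Shake's version additionally records the variable as a dependency, which this sketch does not):

```haskell
import System.Environment (lookupEnv)
import Data.Maybe (fromMaybe)

-- Fall back to a default when the environment variable is unset.
getEnvWithDefault' :: String -> String -> IO String
getEnvWithDefault' def var = fromMaybe def <$> lookupEnv var

main :: IO ()
main = putStrLn =<< getEnvWithDefault' "unknown" "SHAKE_SKETCH_UNSET_VAR"
```

Assuming `SHAKE_SKETCH_UNSET_VAR` (a made-up name) is not set, this prints the fallback value.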
-- Some examples: -- -- > getDirectoryFiles "Config" ["//*.xml"] -- > -- All .xml files anywhere under the Config directory -- > -- If Config/foo/bar.xml exists it will return ["foo/bar.xml"] -- > getDirectoryFiles "Modules" ["*.hs","*.lhs"] -- > -- All .hs or .lhs in the Modules directory -- > -- If Modules/foo.hs and Modules/foo.lhs exist, it will return ["foo.hs","foo.lhs"] -- -- If you require a qualified file name it is often easier to use @\"\"@ as 'FilePath' argument, -- for example the following two expressions are equivalent: -- -- > fmap (map ("Config" </>)) (getDirectoryFiles "Config" ["//*.xml"]) -- > getDirectoryFiles "" ["Config//*.xml"] getDirectoryFiles :: FilePath -> [FilePattern] -> Action [FilePath] getDirectoryFiles x f = getDirAction $ GetDirFiles x f -- | Get the directories in a directory, not including @.@ or @..@. -- All directories are relative to the argument directory. The result is tracked as a -- dependency, and if it changes the rule will rerun in subsequent builds. -- -- -- > getDirectoryDirs "/Users" -- > -- Return all directories in the /Users directory -- > -- e.g. ["Emily","Henry","Neil"] getDirectoryDirs :: FilePath -> Action [FilePath] getDirectoryDirs x = getDirAction $ GetDirDirs x getDirAction x = do GetDirectoryA y <- apply1 x; return y contents :: FilePath -> IO [FilePath] -- getDirectoryContents "" is equivalent to getDirectoryContents "." on Windows, -- but raises an error on Linux. We smooth out the difference. contents x = fmap (filter $ not . all (== '.')) $ IO.getDirectoryContents $ if x == "" then "." else x answer :: [FilePath] -> GetDirectoryA answer = GetDirectoryA .
sort getDir :: GetDirectoryQ -> IO GetDirectoryA getDir GetDir{..} = fmap answer $ contents dir getDir GetDirDirs{..} = fmap answer $ filterM f =<< contents dir where f x = IO.doesDirectoryExist $ dir </> x getDir GetDirFiles{..} = fmap answer $ concatMapM f $ directories pat where test = let ps = map (?==) pat in \x -> any ($ x) ps f (dir2,False) = do xs <- fmap (map (dir2 </>)) $ contents $ dir </> dir2 flip filterM xs $ \x -> if not $ test x then return False else fmap not $ IO.doesDirectoryExist $ dir </> x f (dir2,True) = do xs <- fmap (map (dir2 </>)) $ contents $ dir </> dir2 (dirs,files) <- partitionM (\x -> IO.doesDirectoryExist $ dir </> x) xs rest <- concatMapM (\d -> f (d, True)) dirs return $ filter test files ++ rest -- | Remove all files and directories that match any of the patterns within a directory. -- Some examples: -- -- @ -- 'removeFiles' \"output\" [\"\/\/*\"] -- 'removeFiles' \".\" [\"\/\/*.hi\",\"\/\/*.o\"] -- @ -- -- Any directories that become empty after deleting items from within them will themselves be deleted, -- up to (and including) the containing directory. -- This function is often useful when writing a @clean@ action for your build system, -- often as a 'phony' rule. removeFiles :: FilePath -> [FilePattern] -> IO () removeFiles dir pat = do b <- IO.doesDirectoryExist dir when b $ void $ f "" where -- because we generate candidate paths and then match against them, anything like ../ will be ignored, since we never generate ..
-- therefore we can safely know we never escape the containing directory test = let ps = map (?==) pat in \x -> any ($ x) ps -- dir </> dir2 is the part to operate on, return True if you deleted the directory f :: FilePath -> IO Bool f dir2 | test dir2 = do IO.removeDirectoryRecursive $ dir </> dir2 return True f dir2 = do xs <- fmap (map (dir2 </>)) $ contents $ dir </> dir2 (dirs,files) <- partitionM (\x -> IO.doesDirectoryExist $ dir </> x) xs noDirs <- fmap and $ mapM f dirs let (del,keep) = partition test files forM_ del $ \d -> IO.removeFile $ dir </> d let die = noDirs && null keep && not (null xs) when die $ IO.removeDirectory $ dir </> dir2 return die -- | Remove files, like 'removeFiles', but executed after the build completes successfully. -- Useful for implementing @clean@ actions that delete files Shake may have open for building. removeFilesAfter :: FilePath -> [FilePattern] -> Action () removeFilesAfter a b = do putLoud $ "Will remove " ++ unwords b ++ " from " ++ a runAfter $ removeFiles a b shake-0.15.5/src/Development/Ninja/0000755000000000000000000000000012560222036015171 5ustar0000000000000000shake-0.15.5/src/Development/Ninja/Type.hs0000644000000000000000000000344712560222036016456 0ustar0000000000000000{-# LANGUAGE TupleSections #-} -- | The IO in this module is only to evaluate an environment variable, -- the 'Env' itself is passed around purely.
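The scoped `Env` referred to here (defined later in Development.Ninja.Env) is a chain of mutable maps: lookups fall through to the parent scope, writes go to the innermost one. A dependency-free sketch of the idea, using Data.Map where the real module uses HashMap from unordered-containers:

```haskell
import Data.IORef
import qualified Data.Map as Map

-- A scope is a mutable map plus an optional parent scope.
data Env k v = Env (IORef (Map.Map k v)) (Maybe (Env k v))

newEnv :: IO (Env k v)
newEnv = flip Env Nothing <$> newIORef Map.empty

-- A child scope: writes stay local, reads fall through to the parent.
scopeEnv :: Env k v -> IO (Env k v)
scopeEnv e = flip Env (Just e) <$> newIORef Map.empty

addEnv :: Ord k => Env k v -> k -> v -> IO ()
addEnv (Env ref _) k v = modifyIORef ref (Map.insert k v)

askEnv :: Ord k => Env k v -> k -> IO (Maybe v)
askEnv (Env ref parent) k = do
  mp <- readIORef ref
  case Map.lookup k mp of
    Just v  -> return (Just v)
    Nothing -> maybe (return Nothing) (`askEnv` k) parent

main :: IO ()
main = do
  e <- newEnv
  addEnv e "cflags" "-O2"
  e2 <- scopeEnv e
  addEnv e2 "cflags" "-O0"
  print =<< askEnv e2 "cflags"  -- Just "-O0": the inner scope shadows
  print =<< askEnv e  "cflags"  -- Just "-O2": the outer scope is untouched
```

This is why a `subninja` can rebind variables without affecting the including file's bindings.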
module Development.Ninja.Type( Str, FileStr, Expr(..), Env, newEnv, askVar, askExpr, addEnv, addBind, addBinds, Ninja(..), newNinja, Build(..), Rule(..), ) where import Control.Applicative import Development.Ninja.Env import qualified Data.ByteString.Char8 as BS import Data.Maybe import Prelude type Str = BS.ByteString type FileStr = Str --------------------------------------------------------------------- -- EXPRESSIONS AND BINDINGS data Expr = Exprs [Expr] | Lit Str | Var Str deriving (Show,Eq) askExpr :: Env Str Str -> Expr -> IO Str askExpr e = f where f (Exprs xs) = fmap BS.concat $ mapM f xs f (Lit x) = return x f (Var x) = askVar e x askVar :: Env Str Str -> Str -> IO Str askVar e x = fmap (fromMaybe BS.empty) $ askEnv e x addBind :: Env Str Str -> Str -> Expr -> IO () addBind e k v = addEnv e k =<< askExpr e v addBinds :: Env Str Str -> [(Str, Expr)] -> IO () addBinds e bs = do bs <- mapM (\(a,b) -> (a,) <$> askExpr e b) bs mapM_ (uncurry $ addEnv e) bs --------------------------------------------------------------------- -- STRUCTURE data Ninja = Ninja {rules :: [(Str,Rule)] ,singles :: [(FileStr,Build)] ,multiples :: [([FileStr], Build)] ,phonys :: [(Str, [FileStr])] ,defaults :: [FileStr] ,pools :: [(Str, Int)] } deriving Show newNinja :: Ninja newNinja = Ninja [] [] [] [] [] [] data Build = Build {ruleName :: Str ,env :: Env Str Str ,depsNormal :: [FileStr] ,depsImplicit :: [FileStr] ,depsOrderOnly :: [FileStr] ,buildBind :: [(Str,Str)] } deriving Show data Rule = Rule {ruleBind :: [(Str,Expr)] } deriving Show shake-0.15.5/src/Development/Ninja/Parse.hs0000644000000000000000000000542112560222036016601 0ustar0000000000000000{-# LANGUAGE RecordWildCards, TupleSections #-} module Development.Ninja.Parse(parse) where import qualified Data.ByteString.Char8 as BS import Development.Ninja.Env import Development.Ninja.Type import Development.Ninja.Lexer import Control.Applicative import Control.Monad import Prelude parse :: FilePath -> Env Str Str -> IO Ninja 
parse file env = parseFile file env newNinja parseFile :: FilePath -> Env Str Str -> Ninja -> IO Ninja parseFile file env ninja = do lexes <- lexerFile $ if file == "-" then Nothing else Just file foldM (applyStmt env) ninja $ withBinds lexes withBinds :: [Lexeme] -> [(Lexeme, [(Str,Expr)])] withBinds [] = [] withBinds (x:xs) = (x,a) : withBinds b where (a,b) = f xs f (LexBind a b : rest) = let (as,bs) = f rest in ((a,b):as, bs) f xs = ([], xs) applyStmt :: Env Str Str -> Ninja -> (Lexeme, [(Str,Expr)]) -> IO Ninja applyStmt env ninja@Ninja{..} (key, binds) = case key of LexBuild outputs rule deps -> do outputs <- mapM (askExpr env) outputs deps <- mapM (askExpr env) deps binds <- mapM (\(a,b) -> (a,) <$> askExpr env b) binds let (normal,implicit,orderOnly) = splitDeps deps let build = Build rule env normal implicit orderOnly binds return $ if rule == BS.pack "phony" then ninja{phonys = [(x, normal ++ implicit ++ orderOnly) | x <- outputs] ++ phonys} else if length outputs == 1 then ninja{singles = (head outputs, build) : singles} else ninja{multiples = (outputs, build) : multiples} LexRule name -> return ninja{rules = (name, Rule binds) : rules} LexDefault xs -> do xs <- mapM (askExpr env) xs return ninja{defaults = xs ++ defaults} LexPool name -> do depth <- getDepth env binds return ninja{pools = (name, depth) : pools} LexInclude expr -> do file <- askExpr env expr parseFile (BS.unpack file) env ninja LexSubninja expr -> do file <- askExpr env expr e <- scopeEnv env parseFile (BS.unpack file) e ninja LexDefine a b -> do addBind env a b return ninja LexBind a b -> error $ "Unexpected binding defining " ++ BS.unpack a splitDeps :: [Str] -> ([Str], [Str], [Str]) splitDeps (x:xs) | x == BS.pack "|" = ([],a++b,c) | x == BS.pack "||" = ([],b,a++c) | otherwise = (x:a,b,c) where (a,b,c) = splitDeps xs splitDeps [] = ([], [], []) getDepth :: Env Str Str -> [(Str, Expr)] -> IO Int getDepth env xs = case lookup (BS.pack "depth") xs of Nothing -> return 1 Just x -> do x <- 
askExpr env x case BS.readInt x of Just (i, n) | BS.null n -> return i _ -> error $ "Could not parse depth field in pool, got: " ++ BS.unpack x shake-0.15.5/src/Development/Ninja/Lexer.hs0000644000000000000000000001653512560222036016616 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} {-# OPTIONS_GHC -O2 #-} -- {-# OPTIONS_GHC -ddump-simpl #-} -- | Lexing is a slow point, the code below is optimised module Development.Ninja.Lexer(Lexeme(..), lexer, lexerFile) where import Data.Tuple.Extra import qualified Data.ByteString.Char8 as BS import qualified Data.ByteString.Unsafe as BS import Development.Ninja.Type import qualified Data.ByteString.Internal as Internal import System.IO.Unsafe import Data.Word import Foreign.Ptr import Foreign.Storable import GHC.Exts --------------------------------------------------------------------- -- LIBRARY BITS newtype Str0 = Str0 Str -- null terminated type S = Ptr Word8 chr :: S -> Char chr x = Internal.w2c $ unsafePerformIO $ peek x inc :: S -> S inc x = x `plusPtr` 1 {-# INLINE dropWhile0 #-} dropWhile0 :: (Char -> Bool) -> Str0 -> Str0 dropWhile0 f x = snd $ span0 f x {-# INLINE span0 #-} span0 :: (Char -> Bool) -> Str0 -> (Str, Str0) span0 f x = break0 (not . f) x {-# INLINE break0 #-} break0 :: (Char -> Bool) -> Str0 -> (Str, Str0) break0 f (Str0 bs) = (BS.unsafeTake i bs, Str0 $ BS.unsafeDrop i bs) where i = unsafePerformIO $ BS.unsafeUseAsCString bs $ \ptr -> do let start = castPtr ptr :: S let end = go start return $! Ptr end `minusPtr` start go s@(Ptr a) | c == '\0' || f c = a | otherwise = go (inc s) where c = chr s {-# INLINE break00 #-} -- The predicate must return true for '\0' break00 :: (Char -> Bool) -> Str0 -> (Str, Str0) break00 f (Str0 bs) = (BS.unsafeTake i bs, Str0 $ BS.unsafeDrop i bs) where i = unsafePerformIO $ BS.unsafeUseAsCString bs $ \ptr -> do let start = castPtr ptr :: S let end = go start return $! 
Ptr end `minusPtr` start go s@(Ptr a) | f c = a | otherwise = go (inc s) where c = chr s head0 :: Str0 -> Char head0 (Str0 x) = Internal.w2c $ BS.unsafeHead x tail0 :: Str0 -> Str0 tail0 (Str0 x) = Str0 $ BS.unsafeTail x list0 :: Str0 -> (Char, Str0) list0 x = (head0 x, tail0 x) take0 :: Int -> Str0 -> Str take0 i (Str0 x) = BS.takeWhile (/= '\0') $ BS.take i x --------------------------------------------------------------------- -- ACTUAL LEXER -- Lex each line separately, rather than each lexeme data Lexeme = LexBind Str Expr -- [indent]foo = bar | LexBuild [Expr] Str [Expr] -- build foo: bar | baz || qux (| and || are represented as Expr) | LexInclude Expr -- include file | LexSubninja Expr -- include file | LexRule Str -- rule name | LexPool Str -- pool name | LexDefault [Expr] -- default foo bar | LexDefine Str Expr -- foo = bar deriving Show isVar, isVarDot :: Char -> Bool isVar x = x == '-' || x == '_' || (x >= 'a' && x <= 'z') || (x >= 'A' && x <= 'Z') || (x >= '0' && x <= '9') isVarDot x = x == '.' 
|| isVar x endsDollar :: Str -> Bool endsDollar x = BS.isSuffixOf (BS.singleton '$') x dropN :: Str0 -> Str0 dropN x = if head0 x == '\n' then tail0 x else x dropSpace :: Str0 -> Str0 dropSpace x = dropWhile0 (== ' ') x lexerFile :: Maybe FilePath -> IO [Lexeme] lexerFile file = fmap lexer $ maybe BS.getContents BS.readFile file lexer :: Str -> [Lexeme] lexer x = lexerLoop $ Str0 $ x `BS.append` BS.pack "\n\n\0" lexerLoop :: Str0 -> [Lexeme] lexerLoop c_x | (c,x) <- list0 c_x = case c of '\r' -> lexerLoop x '\n' -> lexerLoop x ' ' -> lexBind $ dropSpace x '#' -> lexerLoop $ dropWhile0 (/= '\n') x 'b' | Just x <- strip "uild " x -> lexBuild $ dropSpace x 'r' | Just x <- strip "ule " x -> lexRule $ dropSpace x 'd' | Just x <- strip "efault " x -> lexDefault $ dropSpace x 'p' | Just x <- strip "ool " x -> lexPool $ dropSpace x 'i' | Just x <- strip "nclude " x -> lexInclude $ dropSpace x 's' | Just x <- strip "ubninja " x -> lexSubninja $ dropSpace x '\0' -> [] _ -> lexDefine c_x where strip str (Str0 x) = if b `BS.isPrefixOf` x then Just $ Str0 $ BS.drop (BS.length b) x else Nothing where b = BS.pack str lexBind c_x | (c,x) <- list0 c_x = case c of '\r' -> lexerLoop x '\n' -> lexerLoop x '#' -> lexerLoop $ dropWhile0 (/= '\n') x '\0' -> [] _ -> lexxBind LexBind c_x lexBuild x | (outputs,x) <- lexxExprs True x , (rule,x) <- span0 isVar $ dropSpace x , (deps,x) <- lexxExprs False $ dropSpace x = LexBuild outputs rule deps : lexerLoop x lexDefault x | (files,x) <- lexxExprs False x = LexDefault files : lexerLoop x lexRule x = lexxName LexRule x lexPool x = lexxName LexPool x lexInclude x = lexxFile LexInclude x lexSubninja x = lexxFile LexSubninja x lexDefine x = lexxBind LexDefine x lexxBind ctor x | (var,x) <- span0 isVarDot x , ('=',x) <- list0 $ dropSpace x , (exp,x) <- lexxExpr False False $ dropSpace x = ctor var exp : lexerLoop x lexxBind _ x = error $ show ("parse failed when parsing binding", take0 100 x) lexxFile ctor x | (exp,rest) <- lexxExpr False False $ 
dropSpace x = ctor exp : lexerLoop rest lexxName ctor x | (name,rest) <- splitLineCont x = ctor name : lexerLoop rest lexxExprs :: Bool -> Str0 -> ([Expr], Str0) lexxExprs stopColon x = case lexxExpr stopColon True x of (a,c_x) | c <- head0 c_x, x <- tail0 c_x -> case c of ' ' -> first (a:) $ lexxExprs stopColon $ dropSpace x ':' | stopColon -> ([a], x) _ | stopColon -> error "expected a colon" '\r' -> a $: dropN x '\n' -> a $: x '\0' -> a $: c_x where Exprs [] $: x = ([], x) a $: x = ([a], x) {-# NOINLINE lexxExpr #-} lexxExpr :: Bool -> Bool -> Str0 -> (Expr, Str0) -- snd will start with one of " :\n\r" or be empty lexxExpr stopColon stopSpace = first exprs . f where exprs [x] = x exprs xs = Exprs xs special = case (stopColon, stopSpace) of (True , True ) -> \x -> x <= ':' && (x == ':' || x == ' ' || x == '$' || x == '\r' || x == '\n' || x == '\0') (True , False) -> \x -> x <= ':' && (x == ':' || x == '$' || x == '\r' || x == '\n' || x == '\0') (False, True ) -> \x -> x <= '$' && ( x == ' ' || x == '$' || x == '\r' || x == '\n' || x == '\0') (False, False) -> \x -> x <= '$' && ( x == '$' || x == '\r' || x == '\n' || x == '\0') f x = case break00 special x of (a,x) -> if BS.null a then g x else Lit a $: g x x $: (xs,y) = (x:xs,y) g x | head0 x /= '$' = ([], x) g x | c_x <- tail0 x, (c,x) <- list0 c_x = case c of '$' -> Lit (BS.singleton '$') $: f x ' ' -> Lit (BS.singleton ' ') $: f x ':' -> Lit (BS.singleton ':') $: f x '\n' -> f $ dropSpace x '\r' -> f $ dropSpace $ dropN x '{' | (name,x) <- span0 isVarDot x, not $ BS.null name, ('}',x) <- list0 x -> Var name $: f x _ | (name,x) <- span0 isVar c_x, not $ BS.null name -> Var name $: f x _ -> error $ "Unexpect $ followed by unexpected stuff" splitLineCont :: Str0 -> (Str, Str0) splitLineCont x = first BS.concat $ f x where f x = if not $ endsDollar a then ([a], b) else let (c,d) = f $ dropSpace b in (BS.init a : c, d) where (a,b) = splitLineCR x splitLineCR :: Str0 -> (Str, Str0) splitLineCR x = if BS.singleton 
'\r' `BS.isSuffixOf` a then (BS.init a, dropN b) else (a, dropN b) where (a,b) = break0 (== '\n') x shake-0.15.5/src/Development/Ninja/Env.hs0000644000000000000000000000202212560222036016251 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} -- | A Ninja style environment, basically a linked-list of mutable hash tables. module Development.Ninja.Env( Env, newEnv, scopeEnv, addEnv, askEnv, fromEnv ) where import qualified Data.HashMap.Strict as Map import Data.Hashable import Data.IORef data Env k v = Env (IORef (Map.HashMap k v)) (Maybe (Env k v)) instance Show (Env k v) where show _ = "Env" newEnv :: IO (Env k v) newEnv = do ref <- newIORef Map.empty; return $ Env ref Nothing scopeEnv :: Env k v -> IO (Env k v) scopeEnv e = do ref <- newIORef Map.empty; return $ Env ref $ Just e addEnv :: (Eq k, Hashable k) => Env k v -> k -> v -> IO () addEnv (Env ref _) k v = modifyIORef ref $ Map.insert k v askEnv :: (Eq k, Hashable k) => Env k v -> k -> IO (Maybe v) askEnv (Env ref e) k = do mp <- readIORef ref case Map.lookup k mp of Just v -> return $ Just v Nothing | Just e <- e -> askEnv e k _ -> return Nothing fromEnv :: Env k v -> IO (Map.HashMap k v) fromEnv (Env ref _) = readIORef ref shake-0.15.5/src/Development/Ninja/All.hs0000644000000000000000000003040112560222036016233 0ustar0000000000000000{-# LANGUAGE RecordWildCards, ViewPatterns #-} module Development.Ninja.All(runNinja) where import Development.Ninja.Env import Development.Ninja.Type import Development.Ninja.Parse import Development.Shake hiding (addEnv) import Development.Shake.ByteString import Development.Shake.Errors import Development.Shake.Rules.File import Development.Shake.Rules.OrderOnly import General.Timing import qualified Data.ByteString as BS8 import qualified Data.ByteString.Char8 as BS import System.Directory import qualified Data.HashMap.Strict as Map import qualified Data.HashSet as Set import Data.Tuple.Extra import Control.Exception.Extra import Control.Monad import Data.Maybe import 
Data.Char import Data.List.Extra import System.Info.Extra runNinja :: FilePath -> [String] -> Maybe String -> IO (Maybe (Rules ())) runNinja file args (Just "compdb") = do dir <- getCurrentDirectory Ninja{..} <- parse file =<< newEnv rules <- return $ Map.fromList [r | r <- rules, BS.unpack (fst r) `elem` args] -- the build items are generated in reverse order, hence the reverse let xs = [(a,b,file,rule) | (a,b@Build{..}) <- reverse $ multiples ++ map (first return) singles , Just rule <- [Map.lookup ruleName rules], file:_ <- [depsNormal]] xs <- forM xs $ \(out,Build{..},file,Rule{..}) -> do -- the order of adding new environment variables matters env <- scopeEnv env addEnv env (BS.pack "out") (BS.unwords $ map quote out) addEnv env (BS.pack "in") (BS.unwords $ map quote depsNormal) addEnv env (BS.pack "in_newline") (BS.unlines depsNormal) forM_ buildBind $ \(a,b) -> addEnv env a b addBinds env ruleBind commandline <- fmap BS.unpack $ askVar env $ BS.pack "command" return $ CompDb dir commandline $ BS.unpack $ head depsNormal putStr $ printCompDb xs return Nothing runNinja file args (Just x) = error $ "Unknown tool argument, expected 'compdb', got " ++ x runNinja file args tool = do addTiming "Ninja parse" ninja@Ninja{..} <- parse file =<< newEnv return $ Just $ do needDeps <- return $ needDeps ninja -- partial application phonys <- return $ Map.fromList phonys singles <- return $ Map.fromList $ map (first filepathNormalise) singles multiples <- return $ Map.fromList [(x,(xs,b)) | (xs,b) <- map (first $ map filepathNormalise) multiples, x <- xs] rules <- return $ Map.fromList rules pools <- fmap Map.fromList $ forM ((BS.pack "console",1):pools) $ \(name,depth) -> fmap ((,) name) $ newResource (BS.unpack name) depth action $ needBS $ concatMap (resolvePhony phonys) $ if not $ null args then map BS.pack args else if not $ null defaults then defaults else Map.keys singles ++ Map.keys multiples (\x -> fmap (map BS.unpack . 
fst) $ Map.lookup (BS.pack x) multiples) &?> \out -> let out2 = map BS.pack out in build needDeps phonys rules pools out2 $ snd $ multiples Map.! head out2 (flip Map.member singles . BS.pack) ?> \out -> let out2 = BS.pack out in build needDeps phonys rules pools [out2] $ singles Map.! out2 resolvePhony :: Map.HashMap Str [Str] -> Str -> [Str] resolvePhony mp = f $ Left 100 where f (Left 0) x = f (Right []) x f (Right xs) x | x `elem` xs = error $ "Recursive phony involving " ++ BS.unpack x f a x = case Map.lookup x mp of Nothing -> [x] Just xs -> concatMap (f $ either (Left . subtract 1) (Right . (x:)) a) xs quote :: Str -> Str quote x | BS.any isSpace x = let q = BS.singleton '\"' in BS.concat [q,x,q] | otherwise = x build :: (Build -> [Str] -> Action ()) -> Map.HashMap Str [Str] -> Map.HashMap Str Rule -> Map.HashMap Str Resource -> [Str] -> Build -> Action () build needDeps phonys rules pools out build@Build{..} = do needBS $ concatMap (resolvePhony phonys) $ depsNormal ++ depsImplicit orderOnlyBS $ concatMap (resolvePhony phonys) depsOrderOnly case Map.lookup ruleName rules of Nothing -> error $ "Ninja rule named " ++ BS.unpack ruleName ++ " is missing, required to build " ++ BS.unpack (BS.unwords out) Just Rule{..} -> do env <- liftIO $ scopeEnv env liftIO $ do -- the order of adding new environment variables matters addEnv env (BS.pack "out") (BS.unwords $ map quote out) addEnv env (BS.pack "in") (BS.unwords $ map quote depsNormal) addEnv env (BS.pack "in_newline") (BS.unlines depsNormal) forM_ buildBind $ \(a,b) -> addEnv env a b addBinds env ruleBind applyRspfile env $ do commandline <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "command" depfile <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "depfile" deps <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "deps" description <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "description" pool <- liftIO $ askVar env $ BS.pack "pool" let withPool act = case Map.lookup pool pools of _ | BS.null pool -> 
act Nothing -> error $ "Ninja pool named " ++ BS.unpack pool ++ " not found, required to build " ++ BS.unpack (BS.unwords out) Just r -> withResource r 1 act when (description /= "") $ putNormal description let (cmdOpts, cmdProg, cmdArgs) = toCommand commandline if deps == "msvc" then do Stdout stdout <- withPool $ command cmdOpts cmdProg cmdArgs prefix <- liftIO $ fmap (fromMaybe $ BS.pack "Note: including file: ") $ askEnv env $ BS.pack "msvc_deps_prefix" needDeps build $ parseShowIncludes prefix $ BS.pack stdout else withPool $ command_ cmdOpts cmdProg cmdArgs when (depfile /= "") $ do when (deps /= "gcc") $ need [depfile] depsrc <- liftIO $ BS.readFile depfile needDeps build $ concatMap snd $ parseMakefile depsrc -- correct as per the Ninja spec, but breaks --skip-commands -- when (deps == "gcc") $ liftIO $ removeFile depfile needDeps :: Ninja -> Build -> [Str] -> Action () needDeps Ninja{..} = \build xs -> do -- eta reduced so 'builds' is shared opts <- getShakeOptions if isNothing $ shakeLint opts then needBS xs else do neededBS xs -- now try and statically validate needed will never fail -- first find which dependencies are generated files xs <- return $ filter (`Map.member` builds) xs -- now try and find them as dependencies let bad = xs `difference` allDependencies build case bad of [] -> return () x:_ -> liftIO $ errorStructured "Lint checking error - file in deps is generated and not a pre-dependency" [("File", Just $ BS.unpack x)] "" where builds :: Map.HashMap FileStr Build builds = Map.fromList $ singles ++ [(x,y) | (xs,y) <- multiples, x <- xs] -- do list difference, assuming a small initial set, most of which occurs early in the list difference :: [Str] -> [Str] -> [Str] difference [] ys = [] difference xs ys = f (Set.fromList xs) ys where f xs [] = Set.toList xs f xs (y:ys) | y `Set.member` xs = if Set.null xs2 then [] else f xs2 ys where xs2 = Set.delete y xs f xs (y:ys) = f xs ys -- find all dependencies of a rule, no duplicates, with all 
dependencies of this rule listed first allDependencies :: Build -> [FileStr] allDependencies rule = f Set.empty [] [rule] where f seen [] [] = [] f seen [] (x:xs) = f seen (map filepathNormalise $ depsNormal x ++ depsImplicit x ++ depsOrderOnly x) xs f seen (x:xs) rest | x `Set.member` seen = f seen xs rest | otherwise = x : f (Set.insert x seen) xs (maybeToList (Map.lookup x builds) ++ rest) applyRspfile :: Env Str Str -> Action a -> Action a applyRspfile env act = do rspfile <- liftIO $ fmap BS.unpack $ askVar env $ BS.pack "rspfile" rspfile_content <- liftIO $ askVar env $ BS.pack "rspfile_content" if rspfile == "" then act else flip actionFinally (ignore $ removeFile rspfile) $ do liftIO $ BS.writeFile rspfile rspfile_content act parseShowIncludes :: Str -> Str -> [FileStr] parseShowIncludes prefix out = [y | x <- BS.lines out, prefix `BS.isPrefixOf` x , let y = BS.dropWhile isSpace $ BS.drop (BS.length prefix) x , not $ isSystemInclude y] -- Dodgy, but ported over from the original Ninja isSystemInclude :: FileStr -> Bool isSystemInclude x = bsProgFiles `BS.isInfixOf` tx || bsVisStudio `BS.isInfixOf` tx where tx = BS8.map (\c -> if c >= 97 then c - 32 else c) x -- optimised toUpper that only cares about letters and spaces bsProgFiles = BS.pack "PROGRAM FILES" bsVisStudio = BS.pack "MICROSOFT VISUAL STUDIO" data CompDb = CompDb {cdbDirectory :: String ,cdbCommand :: String ,cdbFile :: String } deriving Show printCompDb :: [CompDb] -> String printCompDb xs = unlines $ ["["] ++ concat (zipWith f [1..] xs) ++ ["]"] where n = length xs f i CompDb{..} = [" {" ," \"directory\": " ++ g cdbDirectory ++ "," ," \"command\": " ++ g cdbCommand ++ "," ," \"file\": " ++ g cdbFile ," }" ++ (if i == n then "" else ",")] g = show toCommand :: String -> ([CmdOption], String, [String]) toCommand s -- On POSIX, Ninja does a /bin/sh -c, and so does Haskell in Shell mode (easy). 
| not isWindows = ([Shell], s, []) -- On Windows, Ninja passes the string directly to CreateProcess, -- but Haskell applies some escaping first. -- We try and get back as close to the original as we can, but it's very hacky | length s < 8000 = -- Using the "cmd" program adds overhead (I measure 7ms), and a limit of 8191 characters, -- but is the most robust, requiring no additional escaping. ([Shell], s, []) | (cmd,s) <- word1 s, map toUpper cmd `elem` ["CMD","CMD.EXE"], ("/c",s) <- word1 s = -- Given "cmd.exe /c " we translate to Shell, which adds cmd.exe -- (looked up on the current path) and /c to the front. CMake uses this rule a lot. -- Adding quotes around the pieces after /c goes very wrong. ([Shell], s, []) | otherwise = -- It's a long command line which doesn't call "cmd /c". We reverse the escaping -- Haskell applies, but each argument will still gain quotes around it. let xs = splitArgs s in ([], head $ xs ++ [""], drop 1 xs) data State = Gap -- ^ Currently in the gap between words | Word -- ^ Currently inside a space-separated argument | Quot -- ^ Currently inside a quote-surrounded argument -- | The process package contains a translate function, reproduced below. The aim is that after command line -- parsing we should get out mostly the same answer.
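Before the full backslash-aware splitArgs below, here is a toy splitter (a made-up helper, `splitSimple`, not part of Shake) covering only the space-and-quote cases, to show the shape of the problem. It deliberately omits the 2n/2n+1 backslash-before-quote rule that splitArgs implements:

```haskell
import Data.Char (isSpace)

-- Spaces separate words; double quotes group a word containing spaces.
-- No backslash handling: this is the simplified core of the algorithm only.
splitSimple :: String -> [String]
splitSimple s = case dropWhile isSpace s of
  ""       -> []
  '"':rest -> let (w, rest') = break (== '"') rest
              in w : splitSimple (drop 1 rest')
  rest     -> let (w, rest') = break isSpace rest
              in w : splitSimple rest'

main :: IO ()
main = print (splitSimple "gcc -c \"my file.c\" -o out.o")
-- prints ["gcc","-c","my file.c","-o","out.o"]
```

splitArgs below refines this by tracking whether it is in a gap, a word, or a quoted region, and by collapsing runs of backslashes before quotes the way CreateProcess-style parsing does.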
splitArgs :: String -> [String] splitArgs = f Gap where f Gap (x:xs) | isSpace x = f Gap xs f Gap ('\"':xs) = f Quot xs f Gap [] = [] f Gap xs = f Word xs f Word (x:xs) | isSpace x = [] : f Gap xs f Quot ('\"':xs) = [] : f Gap xs f s ('\\':xs) | (length -> a, b) <- span (== '\\') xs = case b of '\"':xs | even a -> add (replicate (a `div` 2) '\\' ++ "\"") $ f s xs | otherwise -> add (replicate ((a+1) `div` 2) '\\') $ f s ('\"':xs) xs -> add (replicate (a+1) '\\') $ f s xs f s (x:xs) = add [x] $ f s xs f s [] = [] : [] add a (b:c) = (a++b):c add a [] = a:[] {- translate (cmd,args) = unwords $ f cmd : map f args where f x = '"' : snd (foldr escape (True,"\"") xs) escape '"' (_, str) = (True, '\\' : '"' : str) escape '\\' (True, str) = (True, '\\' : '\\' : str) escape c (_, str) = (False, c : str) -} shake-0.15.5/src/Development/Make/0000755000000000000000000000000012560222036015007 5ustar0000000000000000shake-0.15.5/src/Development/Make/Type.hs0000644000000000000000000000257112560222036016271 0ustar0000000000000000 module Development.Make.Type where import Control.Monad data Makefile = Makefile [Stmt] deriving Show data Stmt = Rule {targets :: Expr ,prerequisites :: Expr ,commands :: [Command] } | Assign {name :: String ,assign :: Assign ,expr :: Expr } deriving Show data Assign = Equals | ColonEquals | PlusEquals | QuestionEquals deriving Show data Expr = Apply String [Expr] | Concat [Expr] | Var String | Lit String deriving Show data Command = Expr Expr | String := Expr deriving Show descendExpr :: (Expr -> Expr) -> Expr -> Expr descendExpr f (Apply a b) = Apply a $ map f b descendExpr f (Concat xs) = Concat $ map f xs descendExpr f x = x descendExprM :: Monad m => (Expr -> m Expr) -> Expr -> m Expr descendExprM f (Apply a b) = Apply a `liftM` mapM f b descendExprM f (Concat xs) = Concat `liftM` mapM f xs descendExprM f x = return x transformExpr :: (Expr -> Expr) -> Expr -> Expr transformExpr f = f . 
descendExpr (transformExpr f) simplifyExpr :: Expr -> Expr simplifyExpr = transformExpr f where f (Concat xs) = case g xs of [] -> Lit "" [x] -> x xs -> Concat xs f x = x g (Concat x:xs) = g $ x ++ xs g (Lit x:Lit y:xs) = g $ Lit (x ++ y) : xs g (x:xs) = x : g xs g [] = [] shake-0.15.5/src/Development/Make/Rules.hs0000644000000000000000000000367312560222036016446 0ustar0000000000000000{-# LANGUAGE MultiParamTypeClasses, GeneralizedNewtypeDeriving, DeriveDataTypeable #-} -- | These are the additional rule types required by Makefile module Development.Make.Rules( need_, want_, defaultRuleFile_, (??>), Phony(..) ) where import Control.Monad.IO.Class import System.Directory import Development.Shake.Core import General.String import Development.Shake.Classes import Development.Shake.FilePath import Development.Shake.FileInfo infix 1 ??> --------------------------------------------------------------------- -- FILE_ RULES -- These are like file rules, but a rule may not bother creating the result -- Which matches the (insane) semantics of make -- If a file is not produced, it will rebuild forever newtype File_Q = File_Q BSU deriving (Typeable,Eq,Hashable,Binary,NFData) instance Show File_Q where show (File_Q x) = unpackU x newtype File_A = File_A (Maybe ModTime) deriving (Typeable,Eq,Hashable,Binary,Show,NFData) instance Rule File_Q File_A where storedValue _ (File_Q x) = fmap (fmap (File_A . Just . fst)) $ getFileInfo x defaultRuleFile_ :: Rules () defaultRuleFile_ = priority 0 $ rule $ \(File_Q x) -> Just $ liftIO $ do res <- getFileInfo x case res of Nothing -> error $ "Error, file does not exist and no rule available:\n " ++ unpackU x Just (mt,_) -> return $ File_A $ Just mt need_ :: [FilePath] -> Action () need_ xs = (apply $ map (File_Q . packU) xs :: Action [File_A]) >> return () want_ :: [FilePath] -> Rules () want_ = action . 
need_ data Phony = Phony | NotPhony deriving Eq (??>) :: (FilePath -> Bool) -> (FilePath -> Action Phony) -> Rules () (??>) test act = rule $ \(File_Q x_) -> let x = unpackU x_ in if not $ test x then Nothing else Just $ do liftIO $ createDirectoryIfMissing True $ takeDirectory x res <- act x liftIO $ fmap (File_A . fmap fst) $ if res == Phony then return Nothing else getFileInfo x_ shake-0.15.5/src/Development/Make/Parse.hs0000644000000000000000000000477312560222036016430 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} module Development.Make.Parse(parse) where import Development.Make.Type import Data.Char import Data.List import Data.Maybe trim = dropWhile isSpace . reverse . dropWhile isSpace . reverse parse :: FilePath -> IO Makefile parse file = do src <- if file == "-" then getContents else readFile file return $ parseMakefile src parseMakefile :: String -> Makefile parseMakefile xs = Makefile $ rejoin $ concatMap parse $ map comments $ continuations $ lines xs where continuations (x:y:xs) | "\\" `isSuffixOf` x = continuations $ (init x ++ dropWhile isSpace y):xs continuations (x:xs) = x : continuations xs continuations [] = [] comments = takeWhile (/= '#') parse x | all isSpace x = [] | all isSpace $ take 1 x = [Right $ parseCommand $ trim x] | (a,b) <- break (== ';') x = Left (parseStmt a) : [Right $ parseCommand $ trim $ drop 1 b | b /= ""] rejoin (Left r@Rule{}:Right e:xs) = rejoin $ Left r{commands = commands r ++ [e]} : xs rejoin (Right e:xs) = error $ "Command must be under a rule: " ++ show e rejoin (Left r:xs) = r : rejoin xs rejoin [] = [] parseStmt :: String -> Stmt parseStmt x | (a,'=':b) <- break (== '=') x, ':' `notElem` a = if "+" `isSuffixOf` a then Assign (trim $ init a) PlusEquals (parseExpr $ trim b) else if "?" 
`isSuffixOf` a then Assign (trim $ init a) QuestionEquals (parseExpr $ trim b) else Assign (trim a) Equals (parseExpr $ trim b) | (a,':':b) <- break (== ':') x = case b of '=':b -> Assign (trim a) ColonEquals (parseExpr $ trim b) ':':'=':b -> Assign (trim a) ColonEquals (parseExpr $ trim b) _ -> Rule (parseExpr $ trim a) (parseExpr $ trim $ fromMaybe b $ stripPrefix ":" b) [] | otherwise = error $ "Invalid statement: " ++ x parseExpr :: String -> Expr parseExpr x = simplifyExpr $ Concat $ f x where f ('$':'$':x) = Lit "$" : f x f ('$':'(':xs) = case break (== ')') xs of (var,')':rest) -> parseVar var : f rest _ -> error $ "Couldn't find trailing `)' after " ++ xs f ('$':'{':xs) = case break (== '}') xs of (var,'}':rest) -> parseVar var : f rest _ -> error $ "Couldn't find trailing `}' after " ++ xs f ('$':x:xs) = Var [x] : f xs f (x:xs) = Lit [x] : f xs f [] = [] parseVar :: String -> Expr parseVar x = Var x parseCommand :: String -> Command parseCommand = Expr . parseExpr shake-0.15.5/src/Development/Make/Env.hs0000644000000000000000000000305412560222036016075 0ustar0000000000000000{-# LANGUAGE PatternGuards #-} -- | The IO in this module is only to evaluate an environment variable, -- the 'Env' itself is passed around purely. 
module Development.Make.Env(Env, newEnv, addEnv, askEnv) where import Development.Make.Type import Data.Maybe import qualified Data.HashMap.Strict as Map newtype Env = Env (Map.HashMap String (Assign,Expr)) newEnv :: [(String,String)] -> Env newEnv xs = Env $ Map.fromList [(a,(Equals,Lit b)) | (a,b) <- xs] addEnv :: String -> Assign -> Expr -> Env -> IO Env addEnv name ass val env@(Env e) = case ass of QuestionEquals -> if isJust $ Map.lookup name e then return env else addEnv name Equals val env Equals -> return $ Env $ Map.insert name (Equals,val) e ColonEquals -> do l <- askEnv env val; return $ Env $ Map.insert name (ColonEquals,Lit l) e PlusEquals -> case Map.lookup name e of Just (Equals,x) -> return $ Env $ Map.insert name (Equals,Concat [x,Lit " ",val]) e Just (ColonEquals,x) -> do l <- askEnv env val; return $ Env $ Map.insert name (ColonEquals,Concat [x,Lit " ",Lit l]) e _ -> addEnv name Equals val env askEnv :: Env -> Expr -> IO String askEnv (Env e) x = do res <- f [] x case simplifyExpr res of Lit x -> return x x -> error $ "Internal error in askEnv, " ++ show x where f seen (Var x) | x `elem` seen = error $ "Recursion in variables, " ++ show seen | Just (_,y) <- Map.lookup x e = f (x:seen) y | otherwise = return $ Lit "" f seen x = descendExprM (f seen) x shake-0.15.5/src/Development/Make/All.hs0000644000000000000000000001060212560222036016052 0ustar0000000000000000{-# LANGUAGE RecordWildCards, PatternGuards #-} module Development.Make.All(runMakefile) where import Development.Shake hiding (addEnv) import Development.Shake.FilePath import Development.Make.Parse import Development.Make.Env import Development.Make.Rules import Development.Make.Type import qualified System.Directory as IO import Data.List import Data.Maybe import Data.Tuple.Extra import Control.Monad import System.Process import System.Exit import System.Environment.Extra import Control.Monad.Trans.State.Strict runMakefile :: FilePath -> [String] -> IO (Rules ()) runMakefile file args = 
do env <- defaultEnv mk <- parse file rs <- eval env mk return $ do defaultRuleFile_ case filter (not . isPrefixOf "." . target) rs of Ruler x _ _ : _ | null args, '%' `notElem` x -> want_ [x] _ -> return () mapM_ (want_ . return) args convert rs data Ruler = Ruler {target :: String ,prereq :: (Env, Expr) -- Env is the Env at this point ,cmds :: (Env, [Command]) -- Env is the Env at the end } eval :: Env -> Makefile -> IO [Ruler] eval env (Makefile xs) = do (rs, env) <- runStateT (fmap concat $ mapM f xs) env return [r{cmds=(env,snd $ cmds r)} | r <- rs] where f :: Stmt -> StateT Env IO [Ruler] f Assign{..} = do e <- get e <- liftIO $ addEnv name assign expr e put e return [] f Rule{..} = do e <- get target <- liftIO $ fmap words $ askEnv e targets return $ map (\t -> Ruler t (e, prerequisites) (undefined, commands)) target convert :: [Ruler] -> Rules () convert rs = match ??> run where match s = any (isJust . check s) rs check s r = makePattern (target r) s run target = do let phony = has False ".PHONY" target let silent = has True ".SILENT" target (deps, cmds) <- fmap (first concat . second concat . unzip) $ forM rs $ \r -> case check target r of Nothing -> return ([], []) Just op -> do let (preEnv,preExp) = prereq r env <- liftIO $ addEnv "@" Equals (Lit target) preEnv pre <- liftIO $ askEnv env preExp vp <- liftIO $ fmap splitSearchPath $ askEnv env $ Var "VPATH" pre <- mapM (vpath vp) $ words $ op pre return (pre, [cmds r]) mapM_ (need_ . 
return) deps forM_ cmds $ \(env,cmd) -> do env <- liftIO $ addEnv "@" Equals (Lit target) env env <- liftIO $ addEnv "^" Equals (Lit $ unwords deps) env env <- liftIO $ addEnv "<" Equals (Lit $ head $ deps ++ [""]) env forM_ cmd $ \c -> case c of Expr c -> (if silent then quietly else id) $ execCommand =<< liftIO (askEnv env c) return $ if phony then Phony else NotPhony has auto name target = or [(null ws && auto) || target `elem` ws | Ruler t (_,Lit s) _ <- rs, t == name, let ws = words s] execCommand :: String -> Action () execCommand x = do res <- if "@" `isPrefixOf` x then sys $ drop 1 x else putNormal x >> sys x when (res /= ExitSuccess) $ error $ "System command failed: " ++ x where sys = quietly . traced (unwords $ take 1 $ words x) . system makePattern :: String -> FilePath -> Maybe (String -> String) makePattern pat v = case break (== '%') pat of (pre,'%':post) -> if pre `isPrefixOf` v && post `isSuffixOf` v && rest >= 0 then Just $ concatMap (\x -> if x == '%' then subs else [x]) else Nothing where rest = length v - (length pre + length post) subs = take rest $ drop (length pre) v otherwise -> if pat == v then Just id else Nothing vpath :: [FilePath] -> FilePath -> Action FilePath vpath [] y = return y vpath (x:xs) y = do b <- doesFileExist $ x y if b then return $ x y else vpath xs y defaultEnv :: IO Env defaultEnv = do exePath <- getExecutablePath env <- getEnvironment cur <- IO.getCurrentDirectory return $ newEnv $ ("EXE",if null exe then "" else "." 
++ exe) : ("MAKE",normaliseEx exePath) : ("CURDIR",normaliseEx cur) : env shake-0.15.5/html/0000755000000000000000000000000012560222036012025 5ustar0000000000000000shake-0.15.5/html/shake-util.js0000644000000000000000000000706112560222036014435 0ustar0000000000000000/*jsl:option explicit*/ "use strict"; ///////////////////////////////////////////////////////////////////// // JQUERY EXTENSIONS jQuery.fn.enable = function (x) { // Set the values to enabled/disabled return this.each(function () { if (x) $(this).removeAttr('disabled'); else $(this).attr('disabled','disabled'); }); }; jQuery.getParameters = function() { // From http://stackoverflow.com/questions/901115/get-querystring-values-with-jquery/3867610#3867610 var params = {}; var a = /\+/g; // Regex for replacing addition symbol with a space var r = /([^&=]+)=?([^&]*)/g; var d = function (s) { return decodeURIComponent(s.replace(a, " ")); }; var q = window.location.search.substring(1); while (true) { var e = r.exec(q); if (!e) break; params[d(e[1])] = d(e[2]); } return params; }; ///////////////////////////////////////////////////////////////////// // STRING FORMATTING function /* export */ showTime(x) // :: Double -> String { function digits(x){var s = String(x); return s.length === 1 ? "0" + s : s;} if (x >= 3600) { x = Math.round(x / 60); return Math.floor(x / 60) + "h" + digits(x % 60) + "m"; } else if (x >= 60) { x = Math.round(x); return Math.floor(x / 60) + "m" + digits(x % 60) + "s"; } else return x.toFixed(2) + "s"; } function /* export */ showPerc(x) // :: Double -> String { return (x*100).toFixed(2) + "%"; } function /* export */ plural(n,not1,is1) // :: Int -> Maybe String -> Maybe String -> String { return n === 1 ? (is1 === undefined ? "" : is1) : (not1 === undefined ? 
"s" : not1); } ///////////////////////////////////////////////////////////////////// // MISC function sum(xs) // :: Num a => [a] -> a { var res = 0; for (var i = 0; i < xs.length; i++) res += xs[i]; return res; } function testRegExp(r, s) { if (typeof r === "string") return s.indexOf(r) !== -1; else return r.test(s); } function execRegExp(r, s) { if (typeof r === "string") return s.indexOf(r) === -1 ? null : []; else return r.exec(s); } function listEq(xs, ys) // :: Eq a => [a] -> [a] -> Bool { if (xs.length !== ys.length) return false; for (var i = 0; i < xs.length; i++) { if (xs[i] !== ys[i]) return false; } return true; } function cache(str, f) // :: (k -> String) -> (k -> v) -> (k -> v) { var cache = {}; return function(k){ var s = str(k); if (!(s in cache)) cache[s] = f(k); return cache[s]; }; } function recordEq(xs, ys) // :: Record -> Record -> Bool { function f(a,b) { for (var s in a) { if (a[s] !== b[s]) return false; } return true; } return f(xs,ys) && f(ys,xs); } function recordCopy(xs) // :: Record -> Record { var res = {}; for (var s in xs) res[s] = xs[s]; return res; } function recordUnion(xs,ys) // :: Record -> Record -> Record -- left biased { var res = recordCopy(ys); for (var s in xs) res[s] = xs[s]; return res; } function concatNub(xs) // :: Eq a => [[a]] -> [a] { var res = []; var seen = {}; for (var i = 0; i < xs.length; i++) { var x = xs[i]; for (var j = 0; j < x.length; j++) { var e = x[j]; if (!(e in seen)) { seen[e] = null; res.push(e); } } } return res; } shake-0.15.5/html/shake-ui.js0000644000000000000000000002544212560222036014100 0ustar0000000000000000/*jsl:option explicit*/ /*jsl:import shake-logic.js*/ "use strict"; // Report // {mode :: String // ,query :: String // ,sort :: String // ,sortRev :: Bool // } // var defaultMode = "summary"; var defaultQuery = ""; var defaultSort = "time"; var currentTable = null; var currentPlot = null; ///////////////////////////////////////////////////////////////////// // GLOBAL DATA var shakeSummary 
= summary(shake); var shakeEx = prepare(shakeSummary, shake); var report = {}; // :: Report function reportURL(report) // :: Report -> URL { return "?mode=" + report.mode + (report.query === defaultQuery ? "" : "&query=" + escape(report.query).replace(/\+/g,"%2B")) + (report.sort === undefined || (!report.sortRev && report.sort === defaultSort) ? "" : "&sort=" + (report.sortRev ? "!" : "") + report.sort); } function urlReport() // :: IO Report { var params = $.getParameters(); var sort = params.sort || defaultSort; var sortRev = false; if (sort.substr(0,1) == "!") { sort = sort.substr(1); sortRev = true; } return {mode: params.mode || defaultMode ,query: params.query || defaultQuery ,sort: sort ,sortRev: sortRev }; } function enteredReport() { return {mode: $("#mode").val() ,query: $("#query").val() ,sort: report.sort ,sortRev: report.sortRev }; } function setReport(r, replace, run) { var changed = false; var report2 = recordUnion(r, report); $("#mode").val(report2.mode); $("#query").val(report2.query); $("#run").enable(false).attr("title", "The current query is displayed"); if (recordEq(report,report2)) return; report = report2; if (window.history) { var title = report.mode + (report.query === "" ? 
"" : ": " + report.query); var url = reportURL(report); if (replace) window.history.replaceState(report, title, url); else window.history.pushState(report, title, url); } $("#link").attr("href", reportURL(report)); if (run) runReport(); } ///////////////////////////////////////////////////////////////////// // TABLE SHOWING var rightAlign = {count:null, time:null, cost:null, run:null, leaf:null, unchanged:null}; var twoColumns = {cost:null, time:null}; var defaultRevSort = {run:null, name:null}; function tableSort(x) { if (report.sort === x) setReport({sortRev: !report.sortRev}, true, false); else setReport({sort: x, sortRev: x in defaultRevSort}, true, false); showTable(currentTable); } function showTable(xs) { currentTable = xs; if (xs.length === 0) { $("#output").html("No data found"); return; } if (!(report.sort in xs[0])) setReport({sort:defaultSort, sortRev:false}, true, false); xs.sort(function(a,b){return (report.sortRev ? -1 : 1) * (b[report.sort] > a[report.sort] ? 1 : -1);}); var res = ""; for (var s in xs[0]) { if (s === "back" || s === "text") continue; res += s in twoColumns ? ""; } res += ""; for (var i = 0; i < xs.length; i++) { var x = xs[i]; res += ""; if (s === "count") res += x[s] + " ×"; else if (s === "time" || s === "cost") res += showTime(x[s]) + ""; } res += ""; } res += "
" + s; if (s === report.sort) res += " " + (report.sortRev ? "▲" : "▼") + ""; res += "
" + showPerc(x[s] / shakeSummary.sumExecution); else res += x[s]; res += "
"; $("#output").html(res); } ///////////////////////////////////////////////////////////////////// // SHOW A PLOT function showPlot(series, options) { var $output = $("#output"); var width = $output.width(); var height = $output.height(); if (series === null && options === null) { if (width === currentPlot.width && height === currentPlot.height) return; series = currentPlot.series; options = currentPlot.options; } currentPlot = {series:series, options:options, width:width, height:height}; // Fudge factors to get it displaying nicely, seems Flot goes outside its bounds var div = $("
").width(width - 20).height(height - 10); $("#output").html("").append(div); $.plot(div, series, options); } window.onresize = function(){ if (currentPlot !== null) showPlot(null, null); }; ///////////////////////////////////////////////////////////////////// // RUNNING function runReport() { currentTable = null; currentPlot = null; try { switch(report.mode) { case "summary": var res = showSummary(shakeSummary); var s = $("#welcome").html(); s += "
    "; for (var i = 0; i < res.length; i++) s += "
  • " + res[i] + "
  • "; s += "
"; s += "

Generated by Shake " + version + ".

"; $("#output").html(s); break; case "cmd-plot": var xs = commandPlot(shakeEx, report.query, 100); var ys = []; for (var s in xs) { var x = xs[s].items; var data = []; for (var j = 0; j < x.length; j++) data.push([j, x[j]]); ys.push({label:s, values:x, data:data, color:xs[s].back, avg:sum(x) / x.length}); } if (ys.length === 0) { $("#output").html("No data found, " + (shakeEx.summary.countTraceLast === 0 ? "there were no traced commands in the last run." : "perhaps your filter is too restrictive?")); } else { ys.sort(function(a,b){return a.avg - b.avg;}); showPlot(ys, { legend: {show:true, position:"nw", sorted:"reverse"}, series: {stack:true, lines:{lineWidth:0,fill:1}}, yaxis: {min:0}, xaxis: {tickFormatter: function (i){return showTime(shakeSummary.maxTraceStopLast * i / 100);}} }); } break; case "cmd-table": showTable(commandTable(shakeEx, report.query)); break; case "rule-table": showTable(ruleTable(shakeEx, report.query)); break; case "rule-graph": var xs = ruleGraph(shakeEx, report.query); if (xs.length > 250) $("#output").html("Viewing a graph with > 250 nodes is not supported, and you have " + xs.length + " nodes. 
Try grouping more aggressively"); else if (typeof Viz === 'undefined') $("#output").html("Profile reports do not seem to have been built with GraphViz support, this feature is unavailable."); else { var res = "digraph \"\"{"; res += "graph[nodesep=0.15,ranksep=0.3];"; res += "node[fontname=\"sans-serif\",fontsize=9,penwidth=0.5,height=0,width=0];"; res += "edge[penwidth=0.5,arrowsize=0.5];"; for (var i = 0; i < xs.length; i++) { res += "a" + i + "[label=\"" + xs[i].name.split("\\").join("\\\\").split("\"").join("\\\"") + "\""; if (xs[i].back) res += ",style=filled,color=\"" + xs[i].back + "\""; if (xs[i].text) res += ",fontcolor=\"" + xs[i].text + "\""; res += "];"; var parents = xs[i].parents; for (var j = 0; j < parents.length; j++) res += "a" + i + "->a" + parents[j] + ";"; var ancestors = xs[i].ancestors; for (var j = 0; j < ancestors.length; j++) res += "a" + i + "->a" + ancestors[j] + "[style=dashed];"; } res += "}"; $("#output").html(Viz(res,"svg")); } break; case "help": $("#output").html($("#help").html()); break; default: throw "Unknown report type: " + report.mode; break; } } catch (e) { if (!(e && e.user)) throw e; $("#output").html($("#error").html()); for (var s in e) $("#output ." + s).text(e[s]); } } ///////////////////////////////////////////////////////////////////// // STATE NAVIGATION function example(mode,query) { setReport({mode:mode, query:query, sortRev:false, sort:defaultSort}, false, true); return false; } $(function(){ setReport(urlReport(), true, true); $("#mode,#query").bind("input change",function(){ var mode = $("#mode").val(); var query = $("#query").val(); var enable = mode !== report.mode || query !== report.query; $("#run").enable(enable).attr("title", enable ? 
"" : "The current query is displayed"); $("#link").attr("href", reportURL(enteredReport())); }); $("#run").click(function(){ setReport(enteredReport(), false, true); }); window.onpopstate = function (e){ setReport(urlReport(), true, true); }; }); ///////////////////////////////////////////////////////////////////// // TEMPLATES $(function(){ $("a.example").each(function(){ var mode = $(this).attr("data-mode"); var query = $(this).attr("data-query"); if (query === undefined) query = $(this).text(); var href = reportURL({mode:mode, query:query}); var onclick = "return example(unescape('" + escape(mode) + "'),unescape('" + escape(query) + "'));"; $(this).attr("href", href).attr("target","_blank")[0].setAttribute("onclick",onclick); }); $("a.shake").each(function(){ var href = "http://hackage.haskell.org/packages/archive/shake/latest/doc/html/Development-Shake.html#v:" + $(this).text().replace("'","-39-"); $(this).attr("href", href).attr("target","_blank"); }); }); shake-0.15.5/html/shake-progress.js0000644000000000000000000000220612560222036015320 0ustar0000000000000000/*jsl:option explicit*/ "use strict"; // Data // {name :: String // ,values :: [Progress] // } // // Progress // {idealSecs :: Double // ,idealPerc :: Double // ,actualSecs :: Double // ,actualPerc :: Double // } $(function(){ $(".version").html("Generated by Shake " + version + "."); $("#output").html(""); for (var i = 0; i < shake.length; i++) { var x = shake[i]; var actual = []; var ideal = []; // Start at t = 5 seconds, since the early progress jumps a lot for (var t = 5; t < x.values.length; t++) { var y = x.values[t]; actual.push([y.idealSecs, y.actualSecs]); ideal.push([y.idealSecs, y.idealSecs]); } var ys = [{data:ideal, color:"gray"}, {label:x.name, data:actual, color:"red"}]; var div = $("
"); $("#output").append(div); $.plot(div, ys, { xaxis: { transform: function (v) { return -v; }, inverseTransform: function (v) { return -v; } } }); } }) shake-0.15.5/html/shake-logic.js0000644000000000000000000003727712560222036014571 0ustar0000000000000000/*jsl:option explicit*/ /*jsl:import shake-util.js*/ "use strict"; // Data // {name :: String // ,built :: Int // ,changed :: Int // ,depends :: [Int] // ,execution :: Double // ,traces :: [Trace] // } // // Trace // {start :: Double // ,stop :: Double // ,command :: String // } function bools(x,y) { return x === "" ? y : x === y ? x : "both"; } function colorAnd(c1, c2) { return c1 === null ? c2 : c1 === c2 ? c1 : undefined; } ////////////////////////////////////////////////////////////////////// // SUMMARY function /* export */ summary(dat) // :: Data -> Summary { // Summary statistics var res = {count : 0 // :: Int, number of rules run ,countLast : 0 // :: Int, number of rules run in the last run ,highestRun : 0 // :: Int, highest run you have seen (add 1 to get the count of runs) ,sumExecution : 0 // :: Seconds, build time in total ,maxExecution : 0 // :: Seconds, longest build rule ,maxExecutionName : "" // :: String, longest build rule ,countTrace : 0, countTraceLast : 0 // :: Int, traced commands run ,sumTrace : 0, sumTraceLast : 0 // :: Seconds, time running traced commands ,maxTrace : 0 // :: Seconds, longest traced command ,maxTraceName : "" // :: String, longest trace command ,maxTraceStopLast : 0 // :: Seconds, time the last traced command stopped }; // Fold over dat to produce the summary res.count = dat.length; for (var i = 0; i < dat.length; i++) { var isLast = dat[i].built === 0; res.countLast += isLast ? 
1 : 0; res.sumExecution += dat[i].execution; res.maxExecution = Math.max(res.maxExecution, dat[i].execution); if (res.maxExecution === dat[i].execution) res.maxExecutionName = dat[i].name; res.highestRun = Math.max(res.highestRun, dat[i].changed); // changed is always greater or equal to built var traces = dat[i].traces; if (!traces) continue; for (var j = 0; j < traces.length; j++) { var time = traces[j].stop - traces[j].start; res.countTrace += 1; res.countTraceLast += isLast ? 1 : 0; res.sumTrace += time; res.sumTraceLast += isLast ? time : 0; res.maxTrace = Math.max(res.maxTrace, time); if (res.maxTrace == time) res.maxTraceName = traces[j].command; res.maxTraceStopLast = Math.max(res.maxTraceStopLast, isLast ? traces[j].stop : 0); } } return res; } function /* export */ showSummary(sum) // Summary -> [String] { return ["This database has tracked " + (sum.highestRun+1) + " run" + plural(sum.highestRun+1) + "." ,"There are " + sum.count + " rules (" + sum.countLast + " rebuilt in the last run)." ,"Building required " + sum.countTrace + " traced commands (" + sum.countTraceLast + " in the last run)." ,"The total (unparallelised) build time is " + showTime(sum.sumExecution) + " of which " + showTime(sum.sumTrace) + " is traced commands." ,"The longest rule takes " + showTime(sum.maxExecution) + " (" + sum.maxExecutionName + ") and the longest traced command takes " + showTime(sum.maxTrace) + " (" + sum.maxTraceName + ")." ,"Last run gave an average parallelism of " + (sum.maxTraceStopLast === 0 ? 0 : sum.sumTraceLast / sum.maxTraceStopLast).toFixed(2) + " times over " + showTime(sum.maxTraceStopLast) + "." 
]; } ///////////////////////////////////////////////////////////////////// // PREPARATION // Mutate the input data, adding in rdeps, being the 1-level reverse dependencies function addRdeps(dat) // Data -> Mutate Data{+rdeps} { // find the reverse dependencies var rdeps = []; for (var i = 0; i < dat.length; i++) rdeps[i] = {}; for (var i = 0; i < dat.length; i++) { var deps = dat[i].depends; for (var j = 0, n = deps.length; j < n; j++) rdeps[deps[j]][i] = true; } for (var i = 0; i < rdeps.length; i++) { var ans = []; for (var j in rdeps[i]) ans.push(Number(j)); dat[i].rdeps = ans; } } // Given an array of indices, calculate the cost to rebuild if all of them change // You must call addRdeps and addCost first function calcRebuildCosts(dat,xs) // Data{+rdeps+cost} -> [Int] -> Double { var seen = {}; var tot = 0; function f(i) { if (seen[i]) return; seen[i] = true; tot += dat[i].execution; var deps = dat[i].rdeps; for (var j = 0, n = deps.length; j < n; j++) f(deps[j]); } if (xs.length === 1 && dat[xs[0]].depends.length === 1) tot = dat[dat[xs[0]].depends[0]].cost + dat[xs[0]].execution; else { for (var i = 0, n = xs.length; i < n; i++) f(xs[i]); } return tot; } // Mutate the dat data, adding in cost, being the cost to rebuild if this item changes function addCost(dat) // Data -> Mutate Data{+cost} { for (var i = 0; i < dat.length; i++) dat[i].cost = calcRebuildCosts(dat, [i]); } function prepare(sum, dat) // Data -> DataEx { addRdeps(dat); addCost(dat); function toHash(r){return typeof r === "string" ? 
"$" + r : "/" + r.source;} function findDirect(key) { var c = cache(toHash, function(r){ var want = {}; for (var i = 0; i < dat.length; i++) { if (testRegExp(r, dat[i].name)) { var deps = dat[i][key]; for (var j = 0; j < deps.length; j++) want[deps[j]] = true; } } return want; }); return function(i,r) { if (i in c(r)) return true; else return false; }; } function findTransitive(key, dirFwd) { var c = cache(toHash, function(r){ var want = {}; for (var i = 0; i < dat.length; i++) { var j = dirFwd ? i : dat.length - 1 - i; if ((j in want) || testRegExp(r,dat[j].name)) { want[j] = true; var deps = dat[j][key]; for (var k = 0; k < deps.length; k++) want[deps[k]] = true; } } return want; }); return function(i,r){return i in c(r);}; } return {original: dat ,summary: sum ,dependsOnThis: findDirect("rdeps") ,thisDependsOn: findDirect("depends") ,dependsOnThisTransitive: findTransitive("depends", false) ,thisDependsOnTransitive: findTransitive("rdeps", true) }; } ///////////////////////////////////////////////////////////////////// // RULES function ruleFilter(dat, query) // DataEx -> Query -> Dict String {items: [DataIndex], color: color} { queryData = dat; var f = readQuery(query); var res = {}; for (queryKey = 0; queryKey < dat.original.length; queryKey++) { queryVal = dat.original[queryKey]; queryName = queryVal.name; queryGroup = null; queryBackColor = null; queryTextColor = null; if (f()) { if (queryGroup === null) queryGroup = queryName; if (!(queryGroup in res)) res[queryGroup] = {items: [queryKey], text: queryTextColor, back: queryBackColor}; else { var c = res[queryGroup]; c.items.push(queryKey); c.text = colorAnd(c.text, queryTextColor); c.back = colorAnd(c.back, queryBackColor); } } } return res; } function ruleTable(dat, query) // DataEx -> Query -> [Record] { var res = ruleFilter(dat, query); var ans = []; for (var s in res) { var xs = res[s].items; var time = 0; var leaf = ""; var unchanged = ""; var run = 100000; for (var i = 0; i < xs.length; i++) { var x = 
dat.original[xs[i]]; time += x.execution; leaf = bools(leaf, x.depends.length === 0); unchanged = bools(unchanged, x.changed !== x.built); run = Math.min(run,x.built); } ans.push({name:s, count:xs.length, time:time, back:res[s].back, text:res[s].text, cost:calcRebuildCosts(dat.original,xs), leaf:leaf, run:run, unchanged:unchanged}); } return ans; } function ruleGraph(dat, query) // DataEx -> Query -> [Record] { var res = ruleFilter(dat, query); var map = {}; // :: Dict Int [Int] -- which nodes a node lives at // loop through each value in res, putting it into map (these are parents) // for any not present, descend through the dat.original list, if you aren't included, add, if you are included, skip var direct = {}; var ind = -1; for (var s in res) { ind++; var xs = res[s].items; for (var i = 0; i < xs.length; i++) direct[xs[i]] = ind; } function getDirect(key) { return key in direct ? [direct[key]] : []; } var indirect = {}; function getIndirect(key) { if (key in indirect) return indirect[key]; if (key in direct) return []; var ds = dat.original[key].depends; var res = []; for (var j = 0; j < ds.length; j++) { res.push(getIndirect(ds[j])); res.push(getDirect(ds[j])); } res = concatNub(res); indirect[key] = res; return res; } var ans = []; for (var s in res) { var xs = res[s].items; var ds = []; var is = []; for (var i = 0; i < xs.length; i++) { var depends = dat.original[xs[i]].depends; for (var j = 0; j < depends.length; j++) { ds.push(getDirect(depends[j])); is.push(getIndirect(depends[j])); } } ds = concatNub(ds); is = concatNub(is); ans.push({name:s, text:res[s].text, back:res[s].back, parents:ds, ancestors:is}); } return ans; } ///////////////////////////////////////////////////////////////////// // COMMANDS function commandFilter(last, dat, query) // DataEx -> Query -> Dict String [Trace] { queryData = dat; var f = readQuery(query); var res = {}; for (queryKey = 0; queryKey < dat.original.length; queryKey++) { queryVal = dat.original[queryKey]; if (last && 
queryVal.built !== 0) continue; var val = {}; for (var s in queryVal) val[s] = queryVal[s]; var ts = queryVal.traces || []; queryVal = val; queryName = queryVal.name; queryBackColor = null; queryTextColor = null; for (var i = 0; i < ts.length; i++) { queryVal.traces = [ts[i]]; queryGroup = null; if (f()) { if (queryGroup === null) queryGroup = ts[i].command; if (!(queryGroup in res)) res[queryGroup] = {items: [ts[i]], text:queryTextColor, back:queryBackColor}; else { var c = res[queryGroup]; c.items.push(ts[i]); c.text = colorAnd(c.text, queryTextColor); c.back = colorAnd(c.back, queryBackColor); } } } } return res; } function commandTable(dat, query) // DataEx -> Query -> [Record] { var res = commandFilter(false, dat, query); var ans = []; for (var s in res) { var xs = res[s].items; var time = 0; for (var i = 0; i < xs.length; i++) time += xs[i].stop - xs[i].start; ans.push({name:s, count:xs.length, text:res[s].text, back:res[s].back, time:time}); } return ans; } function commandPlot(dat, query, buckets) // DataEx -> Query -> Int -> Dict String [Double] { var end = dat.summary.maxTraceStopLast; var res = commandFilter(true, dat, query); var ans = {}; for (var s in res) { var ts = res[s].items; var xs = []; for (var i = 0; i <= buckets; i++) xs.push(0); // fill with 1 more element, but the last bucket will always be 0 for (var i = 0; i < ts.length; i++) { var start = ts[i].start * buckets / end; var stop = ts[i].stop * buckets / end; if (Math.floor(start) === Math.floor(stop)) xs[Math.floor(start)] += stop - start; else { for (var j = Math.ceil(start); j < Math.floor(stop); j++) xs[j]++; xs[Math.floor(start)] += Math.ceil(start) - start; xs[Math.floor(stop)] += stop - Math.floor(stop); } } ans[s] = {items: xs.slice(0,buckets), back: res[s].back || null}; } return ans; } ///////////////////////////////////////////////////////////////////// // ENVIRONMENT function readQuery(query) { var f; try { f = new Function("return " + (query === "" ? 
"true" : query)); } catch (e) { throw {user:true, name:"parse", query:query, message:e.toString()}; } return function(){ try { return f(); } catch (e) { throw {user:true, name:"execution", query:query, message:e.toString()}; } }; } // These are global variables mutated/queried by query execution var queryData = {}; var queryKey = 0; var queryVal = {}; var queryName = ""; var queryGroup = null; var queryBackColor = null; var queryTextColor = null; function childOf(r){return queryData.dependsOnThis(queryKey, r);} function parentOf(r){return queryData.thisDependsOn(queryKey, r);} function ancestorOf(r){return queryData.dependsOnThisTransitive(queryKey, r);} function descendantOf(r){return queryData.thisDependsOnTransitive(queryKey, r);} function descendentOf(r){return descendantOf(r);} function /* export */ group(x) { if (queryGroup === null) queryGroup = ""; queryGroup += (queryGroup === "" ? "" : " ") + x; return true; } function backColor(c, b) { if (b === undefined || b) queryBackColor = c; return true; } function textColor(c, b) { if (b === undefined || b) queryTextColor = c; return true; } function rename(from, to) { queryName = queryName.replace(from, to || ""); return true; } function slowestRule() { return queryData.summary.maxExecutionName; } function /* export */ leaf() { return queryVal.depends.length === 0; } function /* export */ run() { return queryVal.built; } function /* export */ unchanged() { return queryVal.changed !== queryVal.built; } function /* export */ name(r, groupName) { if (r === undefined) return queryName; var res = execRegExp(r, queryName); if (res === null) { if (groupName === undefined) return false; else { group(groupName); return true; } } if (res.length !== 1) { for (var i = 1; i < res.length; i++) group(res[i]); } return true; } function /* export */ command(r, groupName) { var n = (queryVal.traces || []).length; if (r === undefined) return n === 0 ? 
"" : queryVal.traces[0].command; for (var i = 0; i < n; i++) { var res = execRegExp(r, queryVal.traces[i].command); if (res === null) continue; if (res.length !== 1) { for (var j = 1; j < res.length; j++) group(res[j]); } return true; } if (groupName === undefined) return false; else { group(groupName); return true; } } shake-0.15.5/html/progress.html0000644000000000000000000000211512560222036014556 0ustar0000000000000000 Shake progress diagnostic

Shake Progress Analysis

The red line represents the real performance, with time to completion on the X axis and predicted time on the Y axis. A perfect progress prediction would follow the gray line.

Loading...

shake-0.15.5/html/profile.html0000644000000000000000000003220512560222036014355 0ustar0000000000000000 Shake report
Link

Loading...
shake-0.15.5/docs/0000755000000000000000000000000012560222036012011 5ustar0000000000000000shake-0.15.5/docs/Why.md0000644000000000000000000002353312560222036013110 0ustar0000000000000000# Why choose Shake? _See also: [Shake links](https://github.com/ndmitchell/shake#readme); [Shake manual](Manual.md#readme)_ Shake is a library for writing build systems. Most large projects have a custom-written build system, and developers working on the project are likely to run the build system many times a day, spending a noticeable amount of time [waiting for the build system](http://xkcd.com/303/). This document explains why you might pick Shake over alternative tools for writing build systems (e.g. make, Ant, Scons). Shake primarily benefits two groups of people: * **Developers running the build system** - Shake based build systems run quickly, require little manual intervention and report estimated completion time as they go. * **Developers writing the build system** - Shake provides a powerful language for writing build systems, has excellent support for writing large robust build systems, can express many types of build rules and provides profiling information to help speed up builds. In the rest of this document we explain and justify the above claims. Shake combines [cutting edge research](http://community.haskell.org/~ndm/downloads/paper-shake_before_building-10_sep_2012.pdf) with a [robust industrial-quality implementation](http://hackage.haskell.org/package/shake/). Shake is in constant use at many large organisations, including [a large investment bank](http://sc.com/), where it was originally developed and has been in use since 2009. #### Expresses many types of build rule Build systems run user supplied commands in an order satisfying dependencies. Many of the advantages of Shake are due to being able to express more powerful dependencies than other build tools. 
These dependency features ensure you can express the build system you want directly, without having to shoehorn your ideas into whatever dependencies your build tool provides. In particular, Shake can express both more dependencies (so things rebuild when they should) and more fine-grained dependencies (so things don't rebuild because something nearby changed). * Shake build systems can discover additional dependencies after running previous rules, allowing the build system to generate files and then examine them to determine their dependencies, rather than predict the dependencies in advance. Such capabilities are essential when working with generated source files, but often allow build systems to be structured more logically. * Most build systems only allow dependencies between files, but Shake provides user definable dependencies. By default Shake includes support for dependencies on files, the existence of files, environment variables, directory contents and several others, and adding more to your project is easy. In particular you can include dependencies on things like compiler versions or information stored on a remote machine. #### Build systems run quickly Developers are likely to spend a long time waiting for their build system, and consequently Shake is designed to be fast. * The Shake implementation itself is highly optimised, in common with many build tools. In particular, Shake is designed for especially fast execution when nothing has changed -- a common case when developing. * Shake benefits from its powerful dependencies, which can be more accurate and fine-grained, thus ensuring it only builds what is really necessary. * Shake has excellent support for parallelism, fully utilising multicore machines. Shake also supports resource constraints, allowing builds to run with a higher level of parallelism than would otherwise be possible. As an example, you can limit disk-intensive operations (e.g. 
linking) without restricting CPU-intensive operations (e.g. compiling). * Shake avoids rebuilding results where the dependencies are rebuilt but do not change, which is particularly useful for generated source files. This can reduce build times for certain common patterns from hours to seconds. #### Build systems run reliably A build system is responsible for producing the files that developers are working with, so it is crucial that developers trust the result, allowing them to investigate issues without suspecting the build system. * The powerful dependency system ensures that all dependencies can be expressed, ensuring the build never leaves stale files. * The Shake implementation itself has an extensive test suite, combining several example projects and over 100 small unit tests (140 at the last count). In addition, a random build system generator allows extensive testing of key properties, including sufficient rebuilding and correctness in the presence of errors. * Shake builds can be run in a special "lint" mode to check global invariants, detecting and reporting issues such as dependency violations before they cause problems. #### Requires little manual intervention Most build systems occasionally require manual intervention, typically wiping the existing build and starting again, when the build system developers change something fundamental. Shake eliminates the need for any manual intervention, reducing time wasted by users of the build system. * The powerful dependencies ensure things that would normally require manual intervention can be tracked. For example, if the C compiler version is tracked as a dependency, then upgrading the C compiler will automatically rebuild all C files. * Shake includes a version with each script, which can be changed to automatically force a complete rebuild. #### Reports estimated completion time Shake can report estimated completion time, allowing developers to plan their time better. 
* Shake provides both predicted completion time (in minutes and seconds) and the percentage completed. All predictions are based on previously recorded execution times for rules and dynamic predictions of machine load, providing reasonable estimates. * Shake provides methods to display this information in the title bar on Windows, Linux and Mac, and on Windows 7/8 can also display this information as a progress bar in the taskbar. * The progress information can be easily integrated into continuous integration systems, such as Team City. #### Powerful language Shake is implemented as a Haskell library, and Shake build systems are structured as Haskell programs which make heavy use of the Shake library functions. Shake is a delicate balance, providing access to the full power of Haskell (so build systems are not limited), yet also not requiring Haskell knowledge (suitable for any programmer). * By building on top of Haskell, Shake build systems benefit from a powerful standardised language. Having a full language available ensures that anything that would be unsuitable to express in a build system can be implemented in Haskell and used seamlessly. * While Shake build systems are Haskell programs, they can be treated as a powerful version of make with slightly funny syntax. The build system requires no significant Haskell knowledge, and is designed so that most features are accessible by learning the "Shake syntax", without any appreciation of what the underlying Haskell means. #### Supports large robust systems Shake build systems can scale to tens of thousands of lines without becoming unwieldy. * Shake uses Haskell to provide facilities for properly structuring large projects. In particular, Shake build systems can use functions to reuse common functionality, modules to group functions into separate files and packages to allow reusing and sharing modules. 
* The types and utility functions provided by Shake eliminate certain classes of common error, making it harder to express invalid build systems. * The lint mode performs sanity checks of the build system, allowing errors to be caught sooner. #### Provides profiling information Shake can generate profiling information allowing developers to both understand the current system, and identify opportunities for improvement. * The Shake profiling reports are standalone web pages containing plots, tables and graphs of useful information. * The report can be used to speed up the build by identifying which commands are most expensive, which files cause most rebuilding and any bottlenecks in parallelism. * The report can examine details of the last run, providing information about what built and why. * Profiles are always recorded, allowing profile reports to be generated after a run completes, without requesting any "profiling mode" in advance. Shake ensures profiling information is recorded with no measurable performance impact. * Graphs can be generated showing dependencies, usually grouped by either file type or location, making it easy to see the overall structure of the build. * Reports can be mined using a powerful querying language to determine custom information about your build. #### Why not? This document sets out the reasons you _should_ use Shake, but naturally there are some disadvantages: * Shake build systems are written in Haskell. While that provides many benefits, it does mean the Shake syntax follows that of Haskell, and some Shake errors are reported by the compiler as Haskell type errors. Despite being written in Haskell, the [user manual](Manual.md#readme) introduces Shake assuming no Haskell knowledge, so Haskell knowledge is not a requirement and hopefully should not be a barrier to using Shake. * Shake is not likely to be installed by default, while make almost always is. 
* Shake does not provide default build rules, requiring everything to be expressed in your build system. In practice it seems that the default rules included with make are unsuitable for most large scale projects, but smaller projects may find themselves writing a few additional rules. It is hoped that additional experience with Shake will lead to a library of build rules being developed. #### Feedback Do you have any comments on this document? Any questions about Shake? Drop an email to the [mailing list](https://groups.google.com/forum/?fromgroups#!forum/shake-build-system). 
shake-0.15.5/docs/shake-progress.png0000644000000000000000000002532712560222036015465 0ustar0000000000000000[binary PNG image data omitted]
shake-0.15.5/docs/Ninja.md0000644000000000000000000001074512560222036013401 0ustar0000000000000000# Running Ninja builds with Shake _See also: [Shake links](https://github.com/ndmitchell/shake#readme); [Shake manual](Manual.md#readme)_ Shake supports the .ninja file format used by the [Ninja tool](http://martine.github.io/ninja/). This document describes how to use Shake instead of Ninja, and why you might want to do so. #### Installing Shake 1. Install the [Haskell Platform](http://www.haskell.org/platform/), which provides a Haskell compiler and standard libraries. 2. Type `cabal update`, to download information about the latest versions of all Haskell packages. 3. Type `cabal install shake --global`, to build and install Shake and all its dependencies. 4. Type `shake --help`, to verify that Shake was successfully installed. #### Running Shake Change to the directory you usually run Ninja from (where there is a `build.ninja` file) and instead of typing `ninja` type `shake`. 
Ninja defaults to guessing how many processors to use, while Shake defaults to only 1 processor, so you will probably want to run something like `shake -j4` to use 4 processors (with a number appropriate for your computer). The following Ninja options are available in Shake: * Print version is `--version` in Shake. * Change directory before building is `--directory` in Shake. * Specify the .ninja file is `--file` in Shake. * Parallelism is `-j` in Shake. * Avoiding starting new jobs if the load average is over a certain level is not currently supported in Shake. * Keep going until a number of jobs fail is best approximated by `--keep-going` which keeps going regardless of how many jobs fail. * Dry run is not supported in Shake. * Show command lines while building is `--verbose` in Shake. * Debugging stats is `--timings` in Shake. * Debugging explanations are achieved with `--debug` (mostly for Shake developers) and `--report` for end users. * Many of the Ninja subtools have equivalent versions inside `--report`. #### Additional features of Shake For people who are set up to run an existing .ninja build file, there are two features of Shake that may appeal: * If you build with `--report` the file `report.html` will be generated. Open that report file and you can see numerous details about the build - how good the parallel utilisation was, what changed to cause what to rebuild, summary statistics, a dependency graph and more. See the Help page in any generated report for more details. * If you build with `--progress` the console titlebar will display a predicted completion time, how many seconds until your build completes. The predicted time will be fairly inaccurate the first time round, but future runs are influenced by recorded timings, and can produce useful guesses. * If you build with `--lint` certain invariants will be checked, and an error will be raised if they are violated. 
For example, if you depend on a generated file via `depfile`, but do not list it as a dependency (even an order-only dependency), an error will be raised. #### FAQ * If I get this working, or can't get it working because of a bug, do you care? Yes - please email the [mailing list](https://groups.google.com/forum/?fromgroups#!forum/shake-build-system). * Is Shake compatible with all Ninja features? Shake has support for everything in the Ninja manual - including response files, deps, pools and restat. Shake does not yet support rebuilding a file if the command line changes (if people rely on this feature, I am happy to add it). I am unaware of any Ninja files that don't work, but would be surprised if there were not some corner cases that Shake gets wrong (but tell me, and I'll fix it). * Is Shake faster or slower than Ninja? I have one data point - compiling LLVM on Windows under mingw, both take the same time to compile initially, and Ninja takes 0.9s for a nothing-to-do build vs Shake at 0.8s. Shake is slower at parsing Ninja files, so if you have _huge_ .ninja files (e.g. Chromium) Shake will probably be slower. Shake does less work if you don't specify deps, which is probably why it is faster on LLVM (but you should specify deps - it makes both Shake and Ninja faster). As people report more results I am sure both Shake and Ninja will be optimised. * Why did you make Shake interpret .ninja files? There are a few reasons: 1) It seemed like fun. 2) The Ninja team have made both [CMake](http://www.cmake.org/) and [gyp](https://code.google.com/p/gyp/) generate .ninja files, so Shake can now build far more projects. 3) Shake and Ninja are both designed to be fast, and benchmarking them has already improved the speed of Shake. 
shake-0.15.5/docs/Manual.md0000644000000000000000000010142412560222036013552 0ustar0000000000000000# Shake Manual _See also: [Shake links](https://github.com/ndmitchell/shake#readme); [Why choose Shake](Why.md#readme); [Function documentation](http://hackage.haskell.org/packages/archive/shake/latest/doc/html/Development-Shake.html)_ Shake is a Haskell library for writing build systems - designed as a replacement for `make`. This document describes how to get started with Shake, assuming no prior Haskell knowledge. First, let's take a look at a Shake build system: import Development.Shake import Development.Shake.Command import Development.Shake.FilePath import Development.Shake.Util main :: IO () main = shakeArgs shakeOptions{shakeFiles="_build"} $ do want ["_build/run" <.> exe] phony "clean" $ do putNormal "Cleaning files in _build" removeFilesAfter "_build" ["//*"] "_build/run" <.> exe %> \out -> do cs <- getDirectoryFiles "" ["//*.c"] let os = ["_build" </> c -<.> "o" | c <- cs] need os cmd "gcc -o" [out] os "_build//*.o" %> \out -> do let c = dropDirectory1 $ out -<.> "c" let m = out -<.> "m" () <- cmd "gcc -c" [c] "-o" [out] "-MMD -MF" [m] needMakefileDependencies m This build system builds the executable `_build/run` from all C source files in the current directory. It will rebuild if you add or remove any C files in the directory, if the C files themselves change, or if any headers used by the C files change. All generated files are placed in `_build`, and a `clean` command is provided that will wipe all the generated files. In the rest of this manual we'll explain how the above code works and how to extend it. #### Running this example To run the example above: 1. Install the [Haskell Platform](http://www.haskell.org/platform/), which provides a Haskell compiler and standard libraries. 2. Type `cabal update`, to download information about the latest versions of all Haskell packages. 3. 
Type `cabal install shake`, to build and install Shake and all its dependencies. 4. Type `shake --demo`, which will create a directory containing a sample project and the above Shake script (named `Build.hs`), then execute it (equivalent to running `runhaskell Build.hs`). For more details see a [trace of `shake --demo`](Demo.md). ## Basic syntax This section explains enough syntax to write a basic Shake build script. #### Boilerplate The build system above starts with the following boilerplate: 
import Development.Shake
import Development.Shake.Command
import Development.Shake.FilePath
import Development.Shake.Util
 
main :: IO ()
main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
    build rules
All the interesting build-specific code is placed under build rules. Many build systems will be able to reuse that boilerplate unmodified. #### Defining targets A target is a file we want the build system to produce (typically executable files). For example, if we want to produce the file `manual/examples.txt` we can write: want ["manual/examples.txt"] The `want` function takes a list of strings. In Shake lists are written `[item1,item2,item3]` and strings are written `"contents of a string"`. Special characters in strings can be escaped using `\` (e.g. `"\n"` for newline) and directory separators are always written `/`, even on Windows. Most files have the same name on all platforms, but executable files on Windows usually have the `.exe` extension, while on POSIX they have no extension. When writing cross-platform build systems (like the initial example), we can write: want ["_build/run" <.> exe] The `<.>` function adds an extension to a file path, and the built-in `exe` variable evaluates to `"exe"` on Windows and `""` otherwise. 
pattern %> \out -> do
    actions
The pattern is a string saying which files this rule can build. It may be a specific file (e.g. `"manual/examples.txt" %> ...`) or may use wildcards: * The `*` wildcard matches anything apart from a directory separator. For example `"manual/*.txt"` would define a rule for any `.txt` file in the `manual` directory, including `manual/examples.txt`, but would not match `manual/examples.zip`, `examples.txt` or `manual/docs/examples.txt`. * The `//` wildcard matches any number of complete path components. For example `//*.txt` would define a rule for any `.txt` file, including `manual/examples.txt`. As another example, `manual//examples.txt` would match any file named `examples.txt` inside `manual`, including both `manual/examples.txt` and `manual/docs/examples.txt`. It is an error for multiple patterns to match a file being built, so you should keep patterns minimal. Looking at the two rules in the initial example: "_build/run" <.> exe %> ... "_build//*.o" %> ... The first matches only the `run` executable, using `<.> exe` to ensure the executable is correctly named on all platforms. The second matches any `.o` file anywhere under `_build`. As examples, `_build/main.o` and `_build/foo/bar.o` both match while `main.o` and `_build/main.txt` do not. Lots of compilers produce `.o` files, so if you are combining two different languages, say C and Haskell, use the extension `.c.o` and `.hs.o` to avoid overlapping rules. The actions are a list of steps to perform and are listed one per line, indented beneath the rule. Actions both express dependencies (say what this rule uses) and run commands (actually generate the file). During the action the `out` variable is bound to the file that is being produced. #### A simple rule Let's look at a simple example of a rule: "*.rot13" %> \out -> do let src = out -<.> "txt" need [src] cmd "rot13" src "-o" out This rule can build any `.rot13` file. 
Imagine we are building `"file.rot13"`; the rule proceeds by: * Using `let` to define a local variable `src`, using the `-<.>` extension replacement method, which removes the extension from a file and adds a new extension. When `out` is `"file.rot13"` the variable `src` will become `"file.txt"`. * Using `need` to introduce a dependency on the `src` file, ensuring that if `src` changes then `out` will be rebuilt and that `src` will be up-to-date before any further commands are run. * Using `cmd` to run the command line `rot13 file.txt -o file.rot13`, which should read `file.txt` and write out `file.rot13`, the ROT13 encoding of the file. Many rules follow this pattern - calculate some local variables, `need` some dependencies, then use `cmd` to perform some actions. We now discuss each of the three statements. #### Local variables Local variables can be defined as: 
let variable = expression
Where variable is a name consisting of letters, numbers and underscores (a-z, A-Z, 0-9 and \_). All variables _must_ start with a lower-case letter. An expression is any combination of variables and function calls, for example `out -<.> "txt"`. A list of some common functions is discussed later. Variables are evaluated by substituting the expression everywhere the variable is used. In the simple example we could have equivalently written: "*.rot13" %> \out -> do need [out -<.> "txt"] cmd "rot13" (out -<.> "txt") "-o" out Variables are local to the rule they are defined in, cannot be modified, and should not be defined multiple times within a single rule. #### File dependencies You can express a dependency on a file with: need ["file.src"] To depend on multiple files you can write: need ["file.1","file.2"] Or alternatively: need ["file.1"] need ["file.2"] It is preferable to use fewer calls to `need`, if possible, as multiple files required by a `need` can be built in parallel. #### Running external commands The `cmd` function allows you to call system commands, e.g. `gcc`. Taking the initial example, we see: cmd "gcc -o" [out] os After substituting `out` (a string variable) and `os` (a list of strings variable) we might get: cmd "gcc -o" ["_make/run"] ["_build/main.o","_build/constants.o"] The `cmd` function takes any number of space-separated expressions. Each expression can be either a string (which is treated as a space-separated list of arguments) or a list of strings (which is treated as a direct list of arguments). Therefore the above command line is equivalent to either of: cmd "gcc -o _make/run _build/main.o _build/constants.o" cmd ["gcc","-o","_make/run","_build/main.o","_build/constants.o"] To properly handle unknown string variables it is recommended to enclose them in a list, e.g. `[out]`, so that even if `out` contains a space it will be treated as a single argument. 
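The splitting behaviour described above can be modelled with a small stdlib-only sketch. The `Arg` type and `flatten` function below are hypothetical illustrations of the rule, not part of Shake's API:

```haskell
-- Hypothetical model of how `cmd` assembles its argument list:
-- a bare string expression is split on spaces, while a list of
-- strings is passed through verbatim, so `[out]` remains a single
-- argument even if the path contains a space.
data Arg = Str String | List [String]

flatten :: [Arg] -> [String]
flatten = concatMap f
  where
    f (Str s)   = words s  -- "gcc -o" becomes ["gcc","-o"]
    f (List xs) = xs       -- ["_make/run"] is kept as-is

-- flatten [Str "gcc -o", List ["_make/run"], List ["_build/main.o","_build/constants.o"]]
--   == ["gcc","-o","_make/run","_build/main.o","_build/constants.o"]
```

This is why wrapping a variable in a list is the safe default: `Str "a file.txt"` would be split into two arguments, while `List ["a file.txt"]` keeps the space intact.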
The `cmd` function as presented here will fail if the system command returns a non-zero exit code, but see later for how to treat failing commands differently. As a wart, if the `cmd` call is _not_ the last line of a rule, you must precede it with `() <- cmd ...`. #### Filepath manipulation functions Shake provides a complete library of filepath manipulation functions (see the [docs for `Development.Shake.FilePath`](https://hackage.haskell.org/package/shake/docs/Development-Shake-FilePath.html)), but the most common are: * `str1 </> str2` - add the path components together with a slash, e.g. `"_build" </> "main.o"` equals `"_build/main.o"`. * `str1 <.> str2` - add an extension, e.g. `"main" <.> "o"` equals `"main.o"`. * `str1 ++ str2` - append two strings together, e.g. `"hello" ++ "world"` equals `"helloworld"`. * `str1 -<.> str2` - replace an extension, e.g. `"main.c" -<.> "o"` equals `"main.o"`. * `dropExtension str` - drop the final extension of a filepath if it has one, e.g. `dropExtension "main.o"` equals `"main"`, while `dropExtension "main"` equals `"main"`. * `takeFileName str` - drop the directory portion, e.g. `takeFileName "_build/src/main.o"` equals `"main.o"`. * `dropDirectory1 str` - drop the first path component, e.g. `dropDirectory1 "_build/src/main.o"` equals `"src/main.o"`. ## Advanced Syntax The following section covers more advanced operations that are necessary for moderately complex build systems, but not simple ones. #### Directory listing dependencies The function `getDirectoryFiles` can retrieve a list of files within a directory: files <- getDirectoryFiles "" ["//*.c"] After this operation `files` will be a variable containing all the files matching the pattern `"//*.c"` (those with the extension `.c`) starting at the directory `""` (the current directory). 
To obtain all `.c` and `.cpp` files in the src directory we can write: files <- getDirectoryFiles "src" ["//*.c","//*.cpp"] The `getDirectoryFiles` operation is tracked by the build system, so if the files in a directory change the rule will rebuild in the next run. You should only use `getDirectoryFiles` on source files, not files that are generated by the build system, otherwise the results will change while you are running the build and the build may be inconsistent. #### List manipulations Many functions work with lists of values. The simplest operation on lists is to join two lists together, which we do with `++`. For example, `["main.c"] ++ ["constants.c"]` equals `["main.c","constants.c"]`. Using a _list comprehension_ we can produce new lists, applying functions to the elements and filtering them. As an example: ["_build" </> x -<.> "o" | x <- inputs] This expression grabs each element from `inputs` and names it `x` (the `x <- inputs`, pronounced "`x` is drawn from `inputs`"), then applies the expression `"_build" </> x -<.> "o"` to each element. If we start with the list `["main.c","constants.c"]`, we would end up with `["_build/main.o","_build/constants.o"]`. List expressions also allow us to filter the list, for example we might know that the file `"evil.c"` is in the directory but should not be compiled. We can extend that to: ["_build" </> x -<.> "o" | x <- inputs, x /= "evil.c"] The `/=` operator checks for inequality, and any predicate after the drawn-from clause is used to first restrict which elements of the list are available. #### Using `gcc` to collect headers One common problem when building `.c` files is tracking down which headers they transitively import, and thus must be added as a dependency. We can solve this problem by asking `gcc` to create a file while building that contains a list of all the imports. If we run: gcc -c main.c -o main.o -MMD -MF main.m That will compile `main.c` to `main.o`, and also produce a file `main.m` containing the dependencies. 
To add these dependencies as dependencies of this rule we can call:

    needMakefileDependencies "main.m"

Now, if either `main.c` or any headers transitively imported by `main.c` change, the file will be rebuilt. In the initial example the complete rule is:

    "_build//*.o" %> \out -> do
        let c = dropDirectory1 $ out -<.> "c"
        let m = out -<.> "m"
        () <- cmd "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
        needMakefileDependencies m

We first compute the source file `c` (e.g. `"main.c"`) that is associated with the `out` file (e.g. `"_build/main.o"`). We then compute a temporary file `m` to write the dependencies to (e.g. `"_build/main.m"`). We then call `gcc` using the `-MMD -MF` flags and finally call `needMakefileDependencies`.

#### Top-level variables

Variables local to a rule are defined using `let`, but you can also define top-level variables. Top-level variables are defined before the `main` call, for example:

    buildDir = "_build"

You can now use `buildDir` in place of `"_build"` throughout. You can also define parametrised variables (functions) by adding argument names:

    buildDir x = "_build" </> x

We can now write:

    buildDir ("run" <.> exe) %> \out -> do
        ...

All top-level variables and functions can be thought of as being expanded wherever they are used, although in practice they may have their evaluation shared.

#### A clean command

A standard clean command is defined as:

    phony "clean" $ do
        putNormal "Cleaning files in _build"
        removeFilesAfter "_build" ["//*"]

Running the build system with the `clean` argument, e.g. `runhaskell Build.hs clean`, will remove all files under the `_build` directory. This clean command is formed from two separate pieces. Firstly, we can define `phony` commands as:
phony "name" $ do
    actions
Where `name` is the name used on the command line to invoke the actions, and `actions` are the list of things to do in response. These names are not dependency tracked and are simply run afresh each time they are requested.

The actions can be any standard build actions, although for a `clean` rule, `removeFilesAfter` is typical. This function waits until after any files have finished building (which will be none, if you do `runhaskell Build.hs clean`) then deletes all files matching `//*` in the `_build` directory. The `putNormal` function writes out a message to the console, as long as `--quiet` was not passed.

## Running

This section covers how to run the build system you have written.

#### Compiling the build system

As shown before, we can use `runhaskell Build.hs` to execute our build system, but doing so causes the build script to be compiled afresh each time. A more common approach is to add a shell script that compiles the build system and runs it. In the example directory you will find `build.sh` (Linux) and `build.bat` (Windows), both of which execute the same interesting commands. Looking at `build.sh`:

    #!/bin/sh
    mkdir -p _shake
    ghc --make Build.hs -rtsopts -with-rtsopts=-I0 -outputdir=_shake -o _shake/build && _shake/build "$@"

This script creates a folder named `_shake` for the build system objects to live in, then runs `ghc --make Build.hs` to produce `_shake/build`, then executes `_shake/build` with all arguments it was given.

The `-with-rtsopts` flag can be treated as magic - it instructs the Haskell compiler to turn off features that would otherwise steal CPU from the commands you are running.

Now you can run a build by simply typing `./build.sh` on Linux, or `build` on Windows. On Linux you may want to alias `build` to `./build.sh`. For the rest of this document we will assume `build` runs the build system.

_Warning:_ You should not use `-threaded` for GHC 7.6 or below because of a [GHC bug](https://ghc.haskell.org/trac/ghc/ticket/7646).
If you do turn on `-threaded`, you should include `-qg -qb` in `-with-rtsopts`.

#### Command line flags

The initial example build system supports a number of command line flags, including:

* `build` will compile all files required by `want`.
* `build _build/main.o` will compile enough to create `_build/main.o`, ignoring all `want` requirements.
* `build clean` will delete the contents of `_build`, because of our `phony` command.
* `build --help` will list out all flags supported by the build system, currently 36 flags. Most flags supported by `make` are also supported by Shake based build systems.
* `build -j8` will compile up to 8 rules simultaneously; by default Shake uses 1 processor.

Most flags can also be set within the program by modifying the `shakeOptions` value. As an example, `build --metadata=_metadata` causes all Shake metadata files to be stored with names such as `_metadata/.shake.database`. Alternatively we can write `shakeOptions{shakeFiles="_metadata"}` instead of our existing `shakeFiles="_build"`. Values passed on the command line take precedence over those given by `shakeOptions`. Multiple overrides can be given to `shakeOptions` by separating them with a comma, for example `shakeOptions{shakeFiles="_build",shakeThreads=8}`.

#### Progress prediction

One useful feature of Shake is that it can predict the remaining build time, based on how long previous builds have taken. The number is only a prediction, but it does take account of which files require rebuilding, how fast your machine is currently running, parallelism settings etc. You can display progress messages in the titlebar of a window by either:

* Running `build --progress`
* Setting `shakeOptions{shakeProgress=progressSimple}`

The progress message will be displayed in the titlebar of the window, for example `3m12s (82%)` to indicate that the build is 82% complete and is predicted to take a further 3 minutes and 12 seconds.
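The `3m12s` part is simply the predicted seconds remaining rendered as minutes and seconds; in plain Haskell that rendering might look like this (a hypothetical helper for illustration, not Shake's actual formatting code):

```haskell
-- Render predicted seconds remaining in Shake's "3m12s" style
-- (hypothetical helper, not Shake's own implementation).
formatRemaining :: Int -> String
formatRemaining secs = show mins ++ "m" ++ pad (show rest) ++ "s"
  where
    (mins, rest) = secs `divMod` 60
    pad t = replicate (2 - length t) '0' ++ t
```

So `formatRemaining 192` gives `"3m12s"`.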
If you are running Windows 7 or higher and place the [`shake-progress`](https://github.com/ndmitchell/shake) utility somewhere on your `%PATH%` then the progress will also be displayed in the taskbar progress indicator:

![](shake-progress.png)

Progress prediction is likely to be relatively poor during the first build and after running `build clean`, as Shake then has no information about the predicted execution time for each rule. To rebuild from scratch without running clean (because you really want to see the progress bar!) you can use the argument `--always-make`, which assumes all rules need rerunning.

#### Lint

Shake features a built-in "lint" feature to check that the build system is well formed. To run it, use `build --lint`. You are likely to catch more lint violations if you first `build clean`. Sadly, lint does _not_ catch missing dependencies. However, it does catch:

* Changing the current directory, typically with `setCurrentDirectory`. You should never change the current directory within the build system as multiple rules running at the same time share the current directory. You can still run `cmd` calls in different directories using the `Cwd` argument.
* Outputs that change after Shake has built them. The usual cause of this error is if the rule for `foo` also writes to the file `bar`, despite `bar` having a different rule producing it.

There is a performance penalty for building with `--lint`, but it is typically small.

#### Profiling

Shake features an advanced profiling facility. To build with profiling run `build --report`, which will generate an interactive HTML profile named `report.html`. This report lets you examine what happened in that run, what takes most time to run, what rules depend on what etc. There is a help page included in the profiling output, and a [profiling tutorial/demo](https://cdn.rawgit.com/ndmitchell/shake/35fbe03c8d3bafeae17b58af89497ff3fdd54b22/html/demo.html).
To view profiling information for the _previous_ build, you can run `build --no-build --report`. This is useful if a file unexpectedly rebuilds during a build execution: you can generate a profiling report afterwards and see why. To generate a lightweight report (about 5 lines) printed to the console run `build --report=-`.

#### Tracing and debugging

To debug a build system there are a variety of techniques that can be used:

* Run with lint checking enabled (`--lint`), which may spot and describe the problem for you.
* Run in single-threaded mode (`-j1`) to make any output clearer by not interleaving commands.
* By default a Shake build system prints out a message every time it runs a command. Use verbose mode (`--verbose`) to print more information to the screen, such as which rule is being run. Additional `--verbose` flags increase the verbosity. Three verbosity flags produce output intended for someone debugging the Shake library itself, rather than a build system based on it.
* To raise a build error call `error "error message"`. Shake will abort, showing the error message.
* To output additional information use `putNormal "output message"`. This message will be printed to the console when it is reached.
* To show additional information with either `error` or `putNormal`, use `error $ show ("message", myVariable)`. This allows you to show any local variables.

## Extensions

This section details a number of build system features that are useful in some build systems, but not the initial example, and not most average build systems.

#### Advanced `cmd` usage

The `cmd` function can also obtain the stdout and stderr streams, along with the exit code. As an example:

    (Exit code, Stdout out, Stderr err) <- cmd "gcc --version"

Now the variable `code` is bound to the exit code, while `out` and `err` are bound to the stdout and stderr streams. If `Exit` is not requested then any non-zero exit code will raise an error.
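Shake's `cmd` is built on the `process` package, and outside of a build system the same three results can be captured with `readProcessWithExitCode` (a rough sketch of the correspondence, demonstrated with `echo` rather than `gcc` so it runs anywhere):

```haskell
import System.Exit (ExitCode(..))
import System.Process (readProcessWithExitCode)

-- Return the exit code, stdout and stderr of a command - the same
-- three results bound by (Exit code, Stdout out, Stderr err) in Shake.
runCapture :: String -> [String] -> IO (ExitCode, String, String)
runCapture exe args = readProcessWithExitCode exe args ""
```

For example, `runCapture "gcc" ["--version"]` returns the compiler banner on stdout, without raising an error on a non-zero exit code.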
The `cmd` function also takes additional parameters to control how the command is run. As an example:

    cmd Shell (Cwd "temp") "pwd"

This runs the `pwd` command through the system shell, after first changing to the `temp` directory.

#### Dependencies on environment variables

You can introduce tracked dependencies on environment variables using the `getEnv` function. As an example:

    link <- getEnv "C_LINK_FLAGS"
    let linkFlags = fromMaybe "" link
    cmd "gcc -o" [output] inputs linkFlags

This example gets the `$C_LINK_FLAGS` environment variable (which is a `Maybe String`, namely a `String` that might be missing), then using `fromMaybe` defines a local variable `linkFlags` that is the empty string when `$C_LINK_FLAGS` is not set. It then passes these flags to `gcc`. If the `$C_LINK_FLAGS` environment variable changes then this rule will rebuild.

#### Dependencies on extra information

Using Shake we can depend on arbitrary extra information, such as the version of `gcc`, allowing us to automatically rebuild all C files when a different compiler is placed on the path. To track the version, we can define a rule for the file `gcc.version` which changes only when `gcc --version` changes:

    "gcc.version" %> \out -> do
        alwaysRerun
        Stdout stdout <- cmd "gcc --version"
        writeFileChanged out stdout

This rule has the action `alwaysRerun`, meaning it will be run in every execution that requires it, so the `gcc --version` is always checked. This rule defines no dependencies (no `need` actions), so if it lacked `alwaysRerun`, it would only be run when `gcc.version` was missing. The rule then runs `gcc --version`, storing the output in `stdout`. Finally, it calls `writeFileChanged`, which writes `stdout` to `out`, but only if the contents have changed. The use of `writeFileChanged` is important, as otherwise `gcc.version` would change in every run. To use this rule, we `need ["gcc.version"]` in every rule that calls `gcc`.
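The behaviour of `writeFileChanged` can be approximated with ordinary `IO` (a simplified sketch; the real function also deals with file encodings and cooperates with Shake's change tracking):

```haskell
import Control.Monad (unless)
import System.Directory (doesFileExist)

-- Write the file only if the contents differ, leaving the
-- modification time untouched when nothing has changed.
writeFileChangedSketch :: FilePath -> String -> IO ()
writeFileChangedSketch file new = do
    exists <- doesFileExist file
    same <- if not exists then return False else do
        old <- readFile file
        -- force the whole read so the handle is closed before any write
        length old `seq` return (old == new)
    unless same $ writeFile file new
```

Because the file is untouched when the contents match, downstream rules that depend on `gcc.version` see no change and do not rebuild.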
Shake also contains a feature called "oracles", which lets you do the same thing without the use of a file, which is sometimes more convenient. Interested readers should look at the documentation for `addOracle`.

#### Resources

Resources allow us to limit the number of simultaneous operations more precisely than just the number of simultaneous jobs (the `-j` flag). For example, calls to compilers are usually CPU bound but calls to linkers are usually disk bound. Running 8 linkers will often cause an 8 CPU system to grind to a halt. We can limit ourselves to 4 linkers with:

    disk <- newResource "Disk" 4
    want [show i <.> "exe" | i <- [1..100]]
    "*.exe" %> \out -> do
        withResource disk 1 $ do
            cmd "ld -o" [out] ...
    "*.o" %> \out -> do
        cmd "cl -o" [out] ...

Assuming `-j8`, this allows up to 8 compilers, but only a maximum of 4 linkers.

#### Multiple outputs

Some tools, for example [bison](http://www.gnu.org/software/bison/), can generate multiple outputs from one execution. We can track these in Shake using the `&%>` operator to define rules:

    ["//*.bison.h","//*.bison.c"] &%> \[outh, outc] -> do
        let src = outc -<.> "y"
        cmd "bison -d -o" [outc] [src]

Now we define a list of patterns that are matched, and get a list of output files. If any output file is required, then all output files will be built, with proper dependencies.

#### Changing build rules

Shake build systems are set up to rebuild files when the dependencies change, but mostly assume that the build rules themselves do not change. To minimise the impact of build rule changes there are three approaches:

_Use configuration files:_ Most build information, such as which files a C file includes, can be computed from source files. Where such information is not available, such as which C files should be linked together to form an executable, use configuration files to provide the information. The rule for linking can use these configuration files, which can be properly tracked.
Moving any regularly changing configuration into separate files will significantly reduce the number of build system changes.

_Depend on the build source:_ One approach is to depend on the build system source in each of the rules, then if _any_ rules change, _everything_ will rebuild. While this option is safe, it may cause a significant number of redundant rebuilds. As a restricted version of this technique, for a generated file you can include a dependency on the generator source and use `writeFileChanged`. If the generator changes it will rerun, but typically only a few generated files will change, so little is rebuilt.

_Use a version stamp:_ There is a field named `shakeVersion` in the `ShakeOptions` record. If the build system changes in a significant and incompatible way, you can change this field to force a full rebuild. If you want all rules to depend on all rules, you can put a hash of the build system source in the version field, as [described here](http://stackoverflow.com/questions/18532552/shake-how-to-reliably-automatically-force-rebuild-when-my-rules-change-becomi/18532553#18532553).

## The Haskell Zone

From now on, this manual assumes some moderate knowledge of Haskell. Most of the things in this section are either impossible to do with other build systems or can only be faked with shell scripts. None of the Haskell is particularly advanced.

#### Haskell Expressions

You can use any Haskell function at any point. For example, to only link files without numbers in them, we can `import Data.Char` and then write:

    let os = ["_build" </> c -<.> "o" | c <- inputs, not $ any isDigit c]

For defining non-overlapping rules it is sometimes useful to use a more advanced predicate. For example, to define a rule that only builds results which have a numeric extension, we can use the `?>` rule definition function:

    (\x -> all isDigit $ drop 1 $ takeExtension x) ?> \out -> do
        ...
We first get the extension with `takeExtension`, then use `drop 1` to remove the leading `.` that `takeExtension` includes, then test that all the characters are numeric. The standard `%>` operator is actually defined as:

    pattern %> actions = (pattern ?==) ?> actions

Where `?==` is a function for matching file patterns.

#### Haskell Actions

You can run any Haskell `IO` action by using `liftIO`. As an example:

    liftIO $ launchMissiles True

Most common IO operations to run as actions are already wrapped and available in the Shake library, including `readFile'`, `writeFile'` and `copyFile'`. Other useful functions can be found in `System.Directory`.

#### Include files with Visual Studio

While `gcc` has the `-MMD` flag to generate a Makefile, the Visual Studio compiler `cl` does not. However, it does have a flag `-showIncludes` which writes the include files on stdout as they are used. The initial example could be written using `cl` as:

    Stdout stdout <- cmd "cl -showIncludes -c" [input] ["-Fo" ++ output]
    need [ dropWhile isSpace x
         | x <- lines stdout
         , Just x <- [stripPrefix "Note: including file:" x]]

The `stripPrefix` function is available in the `Data.List` module. One warning: the "including file" message is localised, so if your developers are using non-English versions of Visual Studio the prefix string will be different.

#### Generated imports

The initial example compiles the C file, then calls `need` on all its source and header files. This works fine if the header files are all source code. However, if any of the header files are _generated_ by the build system, then when the compilation occurs they will not yet have been built. In general it is important to `need` any generated files _before_ they are used.
To detect the included headers without using the compiler, we can define `usedHeaders` as a top-level function:

    usedHeaders src = [init x | x <- lines src, Just x <- [stripPrefix "#include \"" x]]

This function takes the source code of a C file (`src`) and finds all lines that begin `#include "`, then takes the filename afterwards (using `init` to drop the trailing quote). This function does not work for all C files, but for most projects it is usually easy to write such a function that covers everything allowed by your coding standards.

Assuming all interesting headers are only included directly by the C file (a restriction we remove in the next section), we can write the build rule as:

    "_build//*.o" %> \out -> do
        let c = dropDirectory1 $ out -<.> "c"
        src <- readFile' c
        need $ usedHeaders src
        cmd "gcc -c" [c] "-o" [out]

This code calls `readFile'` (which automatically calls `need` on the source file), then calls `need` on all headers used by the source file, then calls `gcc`. All files have `need` called on them before they are used, so if the C file or any of the header files have build system rules, they will be run.

#### Generated transitive imports

The previous section described how to deal with generated include files, but only coped with headers included directly by the C file. This section describes how to extend that to work with generated headers used either in C or header files, even when used by headers that were themselves generated. We can write:

    ["*.c.dep","*.h.dep"] |%> \out -> do
        src <- readFile' $ dropExtension out
        writeFileLines out $ usedHeaders src

    "*.deps" %> \out -> do
        dep <- readFileLines $ out -<.> "dep"
        deps <- mapM (readFileLines . (<.> "deps")) dep
        writeFileLines out $ nub $ dropExtension out : concat deps

    "*.o" %> \out -> do
        deps <- readFileLines $ out -<.> "c.deps"
        need deps
        cmd "gcc -c" [dropExtension out] "-o" out

For simplicity, this code assumes all files are in a single directory and that all generated files are placed in the same directory.
We define three rules:

* The `*.c.dep` and `*.h.dep` rule uses `|%>`, which defines a single action that matches multiple patterns. The file `foo.h.dep` contains a list of headers directly included by `foo.h`, using `usedHeaders` from the previous section.
* The `*.deps` rule takes the transitive closure of dependencies, so `foo.h.deps` contains `foo.h` and all headers that `foo.h` pulls in. The rule takes the target file, and all the `.deps` for anything in the `.dep` file, and combines them. More abstractly, the rule calculates the transitive closure of _a_, namely _a*_, by taking the dependencies of _a_ (say _b_ and _c_) and computing _a\* = union(a, b\*, c\*)_.
* The `*.o` rule reads the associated `.deps` file (ensuring it is up to date) and then depends on its contents.

The pattern of `*.deps` files occurs frequently, for example when linking Haskell files.

The example directory (`docs/manual` in the Shake tarball) contains the following files.

`docs/manual/main.c`:

    #include <stdio.h>
    #include "constants.h"

    int main()
    {
        printf("%s\n", message());
        return 0;
    }

`docs/manual/constants.h`:

    char* message();

`docs/manual/constants.c`:

    char msg[] = "Hello Shake Users!";

    char* message()
    {
        return msg;
    }

`docs/manual/build.sh`:

    #!/bin/sh
    mkdir -p _shake
    ghc --make Build.hs -rtsopts -with-rtsopts=-I0 -outputdir=_shake -o _shake/build && _shake/build "$@"

`docs/manual/Build.hs`:

    import Development.Shake
    import Development.Shake.Command
    import Development.Shake.FilePath
    import Development.Shake.Util

    main :: IO ()
    main = shakeArgs shakeOptions{shakeFiles="_build"} $ do
        want ["_build/run" <.> exe]

        phony "clean" $ do
            putNormal "Cleaning files in _build"
            removeFilesAfter "_build" ["//*"]

        "_build/run" <.> exe %> \out -> do
            cs <- getDirectoryFiles "" ["//*.c"]
            let os = ["_build" </> c -<.> "o" | c <- cs]
            need os
            cmd "gcc -o" [out] os

        "_build//*.o" %> \out -> do
            let c = dropDirectory1 $ out -<.> "c"
            let m = out -<.> "m"
            () <- cmd "gcc -c" [c] "-o" [out] "-MMD -MF" [m]
            needMakefileDependencies m

`docs/manual/build.bat`:

    @mkdir _shake 2> nul
    @ghc --make Build.hs -rtsopts -with-rtsopts=-I0 -outputdir=_shake -o _shake/build && _shake\build %*